My project works perfectly in mode 'sandbox', but when I switch it to mode 'live' (I did it exactly as indicated in the PayPal-PHP-SDK instructions; my credentials are correct and I set mode 'live' instead of mode 'sandbox'), it gives me the following error:
PayPal\Exception\PayPalConnectionException
Got Http response code 401 when accessing https://api.sandbox.paypal.com/v1/oauth2/token.
Searching for this error, I noticed it has happened to others, but theirs was the following error:
PayPal\Exception\PayPalConnectionException
Got Http response code 401 when accessing https://api.paypal.com/v1/oauth2/token.
Why does my error say sandbox if I have mode 'live'? So I started investigating why, with mode 'live', I kept getting api.sandbox.paypal.com instead of api.paypal.com in the error,
and got to vendor\paypal\rest-api-sdk-php\lib\PayPal\Handler\OauthHandler.php (which I have never modified), which has a _getEndpoint method:
private static function _getEndpoint($config)
{
    if (isset($config['oauth.EndPoint'])) {
        $baseEndpoint = $config['oauth.EndPoint'];
    } elseif (isset($config['service.EndPoint'])) {
        $baseEndpoint = $config['service.EndPoint'];
    } elseif (isset($config['mode'])) {
        switch (strtoupper($config['mode'])) {
            case 'SANDBOX':
                $baseEndpoint = PayPalConstants::REST_SANDBOX_ENDPOINT;
                break;
            case 'LIVE':
                $baseEndpoint = PayPalConstants::REST_LIVE_ENDPOINT;
                break;
            default:
                throw new PayPalConfigurationException('The mode config parameter must be set to either sandbox/live');
        }
    } else {
        // Defaulting to Sandbox
        $baseEndpoint = PayPalConstants::REST_SANDBOX_ENDPOINT;
    }
    $baseEndpoint = rtrim(trim($baseEndpoint), '/') . "/v1/oauth2/token";
    return $baseEndpoint;
}
I noticed that $config always arrives empty in that method; when it is empty, execution falls through to the final else branch, which defaults to sandbox. That's why sandbox works for me even though nothing is arriving there either.
Any idea why this can happen? I really have no idea; any help is welcome.
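For context, this is a minimal sketch of how the mode is normally supplied to the SDK's ApiContext, as the instructions describe (CLIENT_ID and CLIENT_SECRET are placeholders for the live REST app credentials):

<?php
// Minimal sketch of the documented setup; CLIENT_ID and CLIENT_SECRET
// are placeholders for the live REST app credentials.
use PayPal\Rest\ApiContext;
use PayPal\Auth\OAuthTokenCredential;

$apiContext = new ApiContext(new OAuthTokenCredential('CLIENT_ID', 'CLIENT_SECRET'));
$apiContext->setConfig([
    'mode' => 'live', // 'sandbox' or 'live'
]);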
I think you need mode 'production' for that SDK's config.
But you are using the deprecated v1 SDK, which is no longer maintained.
You should be using the v2 Checkout-PHP-SDK, documented here: https://developer.paypal.com/docs/checkout/reference/server-integration/
By the way, the best UI approval flow to pair it with is here: https://developer.paypal.com/demo/checkout/#/pattern/server
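If you do migrate, note that the v2 SDK selects live vs. sandbox with an environment class rather than a 'mode' string. A minimal sketch (CLIENT_ID and CLIENT_SECRET are placeholders for your live credentials):

<?php
// Minimal sketch of a live v2 Checkout-PHP-SDK client; CLIENT_ID and
// CLIENT_SECRET are placeholders for your live REST app credentials.
use PayPalCheckoutSdk\Core\PayPalHttpClient;
use PayPalCheckoutSdk\Core\ProductionEnvironment;
// use PayPalCheckoutSdk\Core\SandboxEnvironment; // for testing

$client = new PayPalHttpClient(new ProductionEnvironment('CLIENT_ID', 'CLIENT_SECRET'));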
I have a solution; it is not a very elegant one, but it works for me. In vendor\paypal\rest-api-sdk-php\lib\PayPal\Core\PayPalConstants.php, change the constant REST_SANDBOX_ENDPOINT from "https://api.sandbox.paypal.com" to "https://api.paypal.com".
I have created two Azure HttpTrigger functions and serve them over HTTPS. During local development, when I call Azure function 2 from Azure function 1, I get the following message:
The SSL connection could not be established, see inner exception.
The remote certificate is invalid according to the validation procedure.
After looking for a solution I found this (solution 1) and this (solution 2).
I tried the first solution (shown below) and it did not make a difference. (Aside: I'm glad, as I don't like removing the security checks for a call.)
ServicePointManager.ServerCertificateValidationCallback += (sender, certificate, chain, errors) =>
{
    var isDevelopment = false;
#if DEBUG
    isDevelopment = true;
#endif
    if (isDevelopment) return true;
    return errors == SslPolicyErrors.None;
};
I considered solution 2 but when my application starts up it clearly states:
Generating a self signed certificate using openssl
My question is: how do I call Azure function 2 from Azure function 1 without disabling ServerCertificateValidationCallback?
UPDATE:
I created a certificate manually and it continued to return the same error. I have managed to suppress the error for local development by replacing ServicePointManager.ServerCertificateValidationCallback with ConfigurePrimaryHttpMessageHandler when I set up my HttpClient, which now looks like the code below. But I would still like to know how to make the call without this being needed.
services.AddHttpClient<ILocationDetailsService, LocationDetailsService>(client =>
{
    var writeBaseUrl = configuration.GetValue<string>("WriteBaseUrl");
    client.BaseAddress = new Uri(writeBaseUrl); // get url from config
    client.DefaultRequestHeaders.Add("ContentType", "application/json");
})
.ConfigurePrimaryHttpMessageHandler(() =>
    new HttpClientHandler()
    {
        ServerCertificateCustomValidationCallback = (sender, cert, chain, sslPolicyErrors) =>
        {
            var isDevelopment = false;
#if DEBUG
            isDevelopment = true;
#endif
            if (isDevelopment) return true;
            return sslPolicyErrors == SslPolicyErrors.None;
        }
    }
);
UPDATE 2:
@John Wu has suggested that I identify the error by navigating to the URL in the browser. In Firefox I get:
https://localhost:7072/api/contact
The certificate is not trusted because it is self-signed.
Error code: MOZILLA_PKIX_ERROR_SELF_SIGNED_CERT
In Chrome I get:
NET::ERR_CERT_AUTHORITY_INVALID
Looks like I have my answer. Once I resolve it I will update with an answer. On a side note, it looks like all my endpoints are doing the same; I had been assuming that they were all working without errors until now. Thanks @John Wu.
The default ASP.NET Core Web API behaviour for an unauthorized request is to send a 401/403 error with empty content. I'd like to change it to return some kind of JSON response describing the error.
But I struggle to find the right place where I can introduce these changes. The official documentation is of no help (I read it all). I had a guess that maybe I could catch an UnauthorizedException in my exception filter / middleware, but it didn't work out (I guess it gets handled at the authorization level, or is not even thrown at all).
So my question is: how can I customize the response behaviour in the case of an unauthorized request?
With .NET Core 3 (or maybe earlier as well) you can write a middleware that checks whether context.Response has a 40x status and then returns a custom object. Below is roughly how I did it:
if (context.Response.StatusCode == (int)HttpStatusCode.Unauthorized)
{
    var result = new MyStandardApiResponseDto
    {
        Error = new MyErrorDto
        {
            Title = "Unauthorized",
            Messages = new List<string> { "You are not authorized to access the resource. Please login again." },
        },
        Result = null
    };
    await context.Response.WriteAsync(JsonConvert.SerializeObject(result));
}
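Roughly, it can be wired up as inline middleware like this (a sketch only; it reuses the MyStandardApiResponseDto / MyErrorDto types from above, assumes Newtonsoft.Json, and must be registered early in the pipeline so the check runs after the rest of the pipeline has produced the 401):

// Startup.Configure: register this before UseAuthentication/UseRouting so the
// status check below runs after authorization and the endpoint have executed.
// (Uses System.Net, System.Collections.Generic and Newtonsoft.Json.)
app.Use(async (context, next) =>
{
    await next(); // let authentication/authorization and the endpoint run first

    if (context.Response.StatusCode == (int)HttpStatusCode.Unauthorized)
    {
        context.Response.ContentType = "application/json";
        var result = new MyStandardApiResponseDto
        {
            Error = new MyErrorDto
            {
                Title = "Unauthorized",
                Messages = new List<string> { "You are not authorized to access the resource. Please login again." }
            },
            Result = null
        };
        await context.Response.WriteAsync(JsonConvert.SerializeObject(result));
    }
});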
I am trying to check whether the native app is installed or not; if it is not, I have to prompt the user to download it from the webpage. For Chrome, I used to achieve this by checking the error messages from runtime.lastError. However, in Firefox the error only appears in the console (No such native application extension_name) and is not caught via runtime.lastError.
Is there any way we can identify whether the corresponding native app is installed or not?
I am facing this issue when the native app is not installed and browser.runtime.lastError is not giving any error.
Can you please suggest whether there is any way in a Firefox WebExtension to catch such errors and determine in code whether the corresponding native app is installed on the user's machine?
It would be really helpful if someone could provide some info on this.
For example:
startNativeApp: function(force) {
    // note that when the native app is opened and ready, it will call "_ABC_onAgentReady"
    ABC.log('Starting native app.');
    if (!ABC.appConnected) {
        try {
            ABC.nativeAppPort = browser.runtime.connectNative(_ABC_native_app_id);
            ABC.nativeAppPort.onMessage.addListener(ABC.onNativeMessageReceived);
            ABC.nativeAppPort.onDisconnect.addListener(ABC.onNativeAppDisconnected);
            ABC.appInstalled = true;
            ABC.appConnected = true;
        } catch (e) {
            ABC.log('Error starting native app: ' + e.message, 'ERR');
        }
    } else if (force === true) {
        ABC.log('Native app is already running; attempting to stop and will restart in 750ms.');
        ABC.stopNativeApp();
        setTimeout(function() { ABC.startNativeApp(true); }, 750);
    }
},

onNativeAppDisconnected: function(message) {
    console.log("ABC LastError : " + browser.runtime.lastError);
    console.log("ABC LastError : " + ABC.nativeAppPort.error);
    console.log("ABC LastError : " + JSON.stringify(message));
    ABC.appConnected = false;
    ABC.nativeAppPort = null;
    ABC.appInstalled = false;
    if (browser.runtime.lastError && (browser.runtime.lastError.message.indexOf("No such native application") !== -1)) {
        ABC.appInstalled = false;
    }
    // cleanup: reset the sig data so that it is re-requested on the next scan
    _ABC_sa_data = "";
    _ABC_sigs = "";
    if (browser.storage && browser.storage.local) {
        browser.storage.local.set({ uid: _ABC_be_uid }, null);
    }
    ABC.log('Send message to page to stop.');
    ABC.sendMessageToPage({ onNativeAppDisconnected: '' });
    ABC.log('Native app disconnected.');
},
The issue here was that port.error was not giving any error response in Firefox versions below 52, because of which I had trouble identifying whether the native app is installed or not.
After a discussion on the Mozilla Community (https://discourse.mozilla-community.org/t/firefox-native-messaging-runtime-lasterror-not-giving-any-errors-in-case-of-no-native-application-installed-on-connectnative/12880/4), we found that it is indeed missing and a bug has already been reported: https://bugzilla.mozilla.org/show_bug.cgi?id=12994116
which will be resolved in Firefox 52.
However, I need to support Firefox 50 as well, so the alternative I am using is to call the native application at startup to find out whether it is installed or not.
If I get back a response, it is installed; otherwise it is not.
However, specific error messages will only be available from Firefox 52.
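Roughly, the probe looks like this (a sketch only; 'appname', the ping payload, and the 2000 ms timeout are placeholders, and it assumes the native app replies to the ping when it is installed):

// Rough sketch of the startup probe described above. "appname" and the
// { ping: true } payload are placeholders, and the 2000 ms timeout is
// arbitrary; it assumes the native app replies to the ping when installed.
function checkNativeAppInstalled() {
    var probe = browser.runtime.sendNativeMessage('appname', { ping: true })
        .then(function () { return true; },    // got a response: installed
              function () { return false; }); // rejected: not installed
    var timeout = new Promise(function (resolve) {
        setTimeout(function () { resolve(false); }, 2000);
    });
    return Promise.race([probe, timeout]);
}

checkNativeAppInstalled().then(function (installed) {
    ABC.appInstalled = installed;
});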
Right now, at Chrome 109, the following approaches do not work after connectNative:
chrome.runtime.lastError: the error is printed (it is visible in the log), but right after the call it is undefined.
console.error = function (arg) {/**/}: replacing the default function does not work.
port.name is "" in both cases (error or no error).
port.onDisconnect is not called if the application is missing.
The only solution left is to call a separate checker:
const promise = chrome.runtime.sendNativeMessage("appname", { /*text: ""*/ });
promise.then(check_response, check_error);
In Firefox there is no runtime.lastError.
The listener function you pass to runtime.Port.onDisconnect isn't passed the message; it's passed the port itself.
You then want port.error.
See the documentation for onDisconnect here: https://developer.mozilla.org/en-US/Add-ons/WebExtensions/API/runtime/Port
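A minimal sketch of that (using the same _ABC_native_app_id placeholder as the question):

// The onDisconnect listener receives the port itself; the failure reason,
// if any, is exposed on port.error.
var port = browser.runtime.connectNative(_ABC_native_app_id);
port.onDisconnect.addListener(function (p) {
    if (p.error && p.error.message.indexOf('No such native application') !== -1) {
        ABC.appInstalled = false;
    }
    ABC.appConnected = false;
});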
I'm looking to deploy an auto-update feature to an Electron installation that I have; however, I am finding it difficult to find any resources on the web.
I've built a self-contained application using Adobe AIR before, and it seemed a lot easier to write update code that effectively checked a URL and automatically downloaded and installed the update across Windows and Mac OS X.
I am currently using the electron-boilerplate for ease of build.
I have a few questions:
How do I debug the auto-update feature? Do I set up a local connection and test through that using a local Node server, or can I use any web server?
In terms of signing the application, I am only looking to run apps on Mac OS X and particularly Windows. Do I have to sign the applications in order to run auto-updates? (I managed to do this with Adobe AIR using a local certificate.)
Are there any good resources that detail how to implement the auto-update feature? I'm having difficulty finding good documentation on how to do this.
I am also new to Electron, but I think there is no simple auto-update in electron-boilerplate (which I also use). Electron's auto-updater uses the Squirrel.Windows installer, which you also need to integrate into your solution in order to use it.
I am currently trying to use this:
https://www.npmjs.com/package/electron-installer-squirrel-windows
And more info can be found here:
https://github.com/atom/electron/blob/master/docs/api/auto-updater.md
https://github.com/squirrel/squirrel.windows
EDIT: I just opened the project to try it for a while and it looks like it works. It's pretty straightforward. These are pieces from my gulpfile.
In the current configuration, I use electron-packager to create a package:
var Q = require('q');                // promise library used by the tasks below
var gulpUtil = require('gulp-util'); // for logging
var packager = require('electron-packager');

var createPackage = function () {
    var deferred = Q.defer();
    packager({
        //OPTIONS
    }, function done(err, appPath) {
        if (err) {
            gulpUtil.log(err);
        }
        deferred.resolve();
    });
    return deferred.promise;
};
Then I create an installer with electron-installer-squirrel-windows:
var squirrelBuilder = require('electron-installer-squirrel-windows');

var createInstaller = function () {
    var deferred = Q.defer();
    squirrelBuilder({
        // OPTIONS
    }, function (err) {
        if (err)
            gulpUtil.log(err);
        deferred.resolve();
    });
    return deferred.promise;
};
You also need to add some Squirrel-handling code to your Electron background/main code. I used the electron-squirrel-startup template:
if(require('electron-squirrel-startup')) return;
The whole thing is described in the electron-installer-squirrel-windows npm documentation mentioned above. That bit of documentation looks like enough to get it started.
Now I am working on Electron branding through Squirrel and on creating appropriate gulp scripts for automation.
You could also use standard Electron's autoUpdater module on OS X and my simple port of it for Windows: https://www.npmjs.com/package/electron-windows-updater
I followed this tutorial and got it working with my Electron app, although it needs to be signed to work, so you would need:
certificateFile: './path/to/cert.pfx'
In the task config.
and:
"build": {
"win": {
"certificateFile": "./path/to/cert.pfx",
"certificatePassword": "password"
}
},
In the package.json
Are there any good resources that detail how to implement the auto-update feature? As I'm having difficulty finding some good documentation on how to do this.
You don't have to implement it yourself. You can use the autoUpdater provided by Electron and just set a feed URL. You need a server that provides the update information in a format compliant with the Squirrel protocol.
There are a couple of self-hosted ones (https://electronjs.org/docs/tutorial/updates#deploying-an-update-server) or a hosted service like https://www.update.rocks
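A minimal sketch of that wiring in the main process (the feed URL is a placeholder for your own Squirrel-compatible update server):

// Main process sketch; the feed URL is a placeholder for your own
// Squirrel-compatible update server.
const { app, autoUpdater } = require('electron');

app.on('ready', () => {
    autoUpdater.setFeedURL({ url: 'https://my-update-server.example.com/update/win32/' + app.getVersion() });
    autoUpdater.checkForUpdates();

    autoUpdater.on('update-downloaded', () => {
        autoUpdater.quitAndInstall(); // restart into the downloaded version
    });
});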
Question 1:
I use Postman to validate that my auto-update server URLs return the response I am expecting. When I know that the URLs provide the expected results, I know I can use those URLs in the Electron autoUpdater of my application.
Example of testing the Mac endpoint with Postman:
Request:
https://my-server.com/api/macupdates/checkforupdate.php?appversion=1.0.5&cpuarchitecture=x64
JSON Response when there is an update available:
{
    "url": "https://my-server.com/updates/darwin/x64/my-electron-app-x64-1.1.0.zip",
    "name": "1.1.0",
    "pub_date": "2021-07-03T15:17:12+00:00"
}
Question 2:
Yes, your Electron app must be code signed to use the auto-update feature on Mac. On Windows I'm not sure, because my Windows Electron app is code signed and I did not try without it. It is recommended that you sign your app even if auto-update could work without it (not only for security reasons, but mainly because otherwise your users will get scary warnings from Windows when they install your app for the first time, and they might just delete it right away).
Question 3:
For good documentation, you should start with the official Electron autoUpdater documentation; as of 2021-07-07 it is really good.
The hard part is figuring out how to make things work for Mac. For Windows it's a matter of minutes and you are done. In fact...
For Windows auto-update, it is easy to set up: you just have to put the RELEASES and .nupkg files on a server and then use that URL as the feed URL within your Electron app's autoUpdater. So if your app's update files are located at https://my-server.com/updates/win32/x64/, you point the Electron autoUpdater at that URL, and that's it.
For Mac auto-update, you need to supply the Electron autoUpdater with the absolute URL of the latest .zip file of your app. So, in order to make the Mac autoUpdater work, you need a way to get a JSON response in a very specific format. Sadly, you can't just put your Electron app's files on your server and expect it to work on Mac just like that. Instead, the autoUpdater needs a URL that returns that JSON response, so you pass Electron's autoUpdater a feed URL that is able to return this expected kind of JSON response.
The way you achieve this can be anything, but I use PHP simply because that is the server I already pay for.
So, in summary: on Mac, even if your files are located at https://my-server.com/updates/darwin/x64/, you will not give that URL to the Electron autoUpdater feed URL. Instead you will give it another URL, which returns the expected JSON response.
Here's an example of my main.js file for the Electron main process of my App:
// main.js (Electron main process)
// Assumes: const { app, autoUpdater } = require('electron');
//          const log = require('electron-log');   // or any logger with .info()
//          const isMac = process.platform === 'darwin';
function registerAutoUpdater() {
    const appVersion = app.getVersion();
    const os = require('os');
    const cpuArchitecture = os.arch();
    const domain = 'https://my-server.com';
    const windowsURL = `${domain}/updates/win32/x64`;
    const macURL = `${domain}/api/macupdates/checkforupdate.php?appversion=${appVersion}&cpuarchitecture=${cpuArchitecture}`;

    // init the autoUpdater with the proper update feed URL
    const autoUpdateURL = `${isMac ? macURL : windowsURL}`;
    autoUpdater.setFeedURL({url: autoUpdateURL});
    log.info('Registered autoUpdateURL = ' + (isMac ? 'macURL' : 'windowsURL'));

    // initial checkForUpdates
    autoUpdater.checkForUpdates();

    // Automatic 2-hour interval loop checkForUpdates
    setInterval(() => {
        autoUpdater.checkForUpdates();
    }, 7200000);
}
And here's an example of the checkforupdate.php file that returns the expected JSON response back to the Electron Auto Updater:
<?php
// FD Electron App Mac auto-update API endpoint.
// The way Squirrel.Mac works is by checking a given API endpoint to see if there is a new version.
// If there is no new version, the endpoint should return HTTP 204. If there is a new version,
// however, it will expect a HTTP 200 JSON-formatted response, containing a url to a .zip file:
// https://github.com/Squirrel/Squirrel.Mac#server-support

$clientAppVersion = $_GET["appversion"] ?? null;
if (!isValidVersionString($clientAppVersion)) {
    http_response_code(204);
    exit();
}

$clientCpuArchitecture = $_GET["cpuarchitecture"] ?? null;
$latestVersionInfo = getLatestVersionInfo($clientAppVersion, $clientCpuArchitecture);
if (!isset($latestVersionInfo["versionNumber"])) {
    http_response_code(204);
    exit();
}

// Real logic starts here when basics did not fail
$isUpdateAvailable = isUpdateAvailable($clientAppVersion, $latestVersionInfo["versionNumber"]);
if ($isUpdateAvailable) {
    http_response_code(200);
    header('Content-Type: application/json;charset=utf-8');
    $jsonResponse = array(
        "url" => $latestVersionInfo["directZipFileURL"],
        "name" => $latestVersionInfo["versionNumber"],
        "pub_date" => date('c', $latestVersionInfo["createdAtUnixTimeStamp"]),
    );
    echo json_encode($jsonResponse);
} else {
    // no update: must respond with a status code of 204 No Content.
    http_response_code(204);
}
exit();

// End of execution.
// Everything below this point is function declarations.

function getLatestVersionInfo($clientAppVersion, $clientCpuArchitecture): array {
    // override path if client requests an arm64 build
    if ($clientCpuArchitecture === 'arm64') {
        $directory = "../../updates/darwin/arm64/";
        $baseUrl = "https://my-server.com/updates/darwin/arm64/";
    } else if (!$clientCpuArchitecture || $clientCpuArchitecture === 'x64') {
        $directory = "../../updates/darwin/";
        $baseUrl = "https://my-server.com/updates/darwin/";
    }

    // default name with version 0.0.0 avoids failing
    $latestVersionFileName = "Finance D - Tenue de livres-darwin-x64-0.0.0.zip";
    $arrayOfFiles = scandir($directory);
    foreach ($arrayOfFiles as $file) {
        if (is_file($directory . $file)) {
            $serverFileVersion = getVersionNumberFromFileName($file);
            if (isVersionNumberGreater($serverFileVersion, $clientAppVersion)) {
                $latestVersionFileName = $file;
            }
        }
    }

    return array(
        "versionNumber" => getVersionNumberFromFileName($latestVersionFileName),
        "directZipFileURL" => $baseUrl . rawurlencode($latestVersionFileName),
        "createdAtUnixTimeStamp" => filemtime(realpath($directory . $latestVersionFileName))
    );
}

function isUpdateAvailable($clientVersion, $serverVersion): bool {
    return
        isValidVersionString($clientVersion) &&
        isValidVersionString($serverVersion) &&
        isVersionNumberGreater($serverVersion, $clientVersion);
}

function getVersionNumberFromFileName($fileName) {
    // extract the version number with regEx replacement
    return preg_replace("/Finance D - Tenue de livres-darwin-(x64|arm64)-|\.zip/", "", $fileName);
}

function removeAllNonDigits($semanticVersionString) {
    // use regex replacement to keep only numeric values in the semantic version string
    return preg_replace("/\D+/", "", $semanticVersionString);
}

function isVersionNumberGreater($serverFileVersion, $clientFileVersion): bool {
    // receives two semantic versions (1.0.4) and compares their numeric value (104)
    // true when server version is greater than client version (105 > 104)
    return removeAllNonDigits($serverFileVersion) > removeAllNonDigits($clientFileVersion);
}

function isValidVersionString($versionString) {
    // true when matches semantic version numbering: 0.0.0
    return preg_match("/\d\.\d\.\d/", $versionString);
}
Background: I'm working on an API which I host on EC2 servers. I just finished the login and set up an nginx load balancer which redirects to the servers' internal IPs. The domain name points to the load balancer.
This used to work well with CodeIgniter, but now I keep getting an "Invalid Host" problem.
I tried googling it and found some things about trusted proxies, so I installed what fideloper made and tried his post as well (I've followed fideloper's guide on laravel-4-trusted-proxies and used his sample on GitHub: fideloper/TrustedProxy), but I still get the same error:
UnexpectedValueException
Invalid Host "api.myserver.im, api.myserver.im"
// as the host can come from the user (HTTP_HOST and depending on the configuration, SERVER_NAME too can come from the user)
// check that it does not contain forbidden characters (see RFC 952 and RFC 2181)
if ($host && !preg_match('/^\[?(?:[a-zA-Z0-9-:\]_]+\.?)+$/', $host)) {
    throw new \UnexpectedValueException(sprintf('Invalid Host "%s"', $host));
}
Can someone help me?
I had the same issue as well. I had to resort to modifying the UrlGenerator.php file, which is part of the framework (bad, I know...) just to get this to work.
So here's my "temporary" solution.
Add an array value to your app.php config file, e.g.:
return array(
    'rooturl' => 'https://www.youractualdomainname.com',
    ...
Next, add the modification below in your UrlGenerator.php file (trunk/vendor/laravel/framework/src/Illuminate/Routing/UrlGenerator.php):
<?php namespace Illuminate\Routing;

use Config;

...

protected function getRootUrl($scheme, $root = null)
{
    $approoturl = Config::get('app.rooturl');
    $root = isset($approoturl) ? $approoturl : $this->request->root();

    return $root;

    // if (is_null($root))
    // {
    //     $root = $this->forcedRoot ?: $this->request->root();
    // }
    // $start = starts_with($root, 'http://') ? 'http://' : 'https://';
    // return preg_replace('~'.$start.'~', $scheme, $root, 1);
}
Do note that composer update will revert your modification.
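If you want to avoid editing vendor code for that reason, a possible variant is to force the root URL from application code instead. This is only a sketch, and it assumes your framework version exposes URL::forceRootUrl(), the public setter behind the $this->forcedRoot referenced in the commented-out code above:

<?php
// app/start/global.php (or a service provider): a sketch that avoids touching
// vendor code. Assumes URL::forceRootUrl() exists in your framework version.
URL::forceRootUrl(Config::get('app.rooturl'));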