PHP composer download signature validation - composer-php

I need to get PHP Composer installed, but I'm not prepared to just curl | php it, and I want to validate the download against a signature or checksum.
The download links on the site are the way I want to go. The devs have also published their public keys on the site, and there are signatures available at ${download}.sig (I found them simply by guessing the URL), but I can't figure out how to verify the download with these signatures.
For example, these are the current latest PHAR and .sig files:
https://getcomposer.org/download/1.0.0-beta1/composer.phar
https://getcomposer.org/download/1.0.0-beta1/composer.phar.sig
The sig file contains:
{"sha384":"FGC1jaYN4TCnaaeha3aeHx1W8gn/GyBaNk09TEOxLXjhWdwFb7psJBtZgXEsrq1Sm0J91j3l2AZDWofQ1s1FHD6A4cZY5H2KQ7KleqIFmDWDVyASHc6tjvaPtlQJ4BCVJEOPsRHX2NTH1roCs48t7S+MbVj5j5K8oVggEw9IOG4uurABUiadOLj/gQ3UpXz1+oflkr358qCkQuUW2upMuHDto8BNLSDYrCLgct1i8aCTCgKo6BYMBGSZxQdGY/dDyRX6rHbR1/CzfJmECgA9qGgeXxDRyjFg/93wsQfuFPCijd6vTpRmsFKYpwfQXMult8t+0mPh4PcvX1GzKtYcLxmsA2MmyPVfX80KtGp2EF5ExRAxOqZtd3ZtwqqOxUeNUfESKrXif1v0PxVGlER4KX5MBvCH9UvwwUPOzyplJ8N+4ybtNGfHiOD3MpPsiVBVoWkQouI5qbHRT39kAGKfMQBDWounrwMGGQV2Ca2/bFMcnInYkXFyLD12yekluoktpBcyFyZcHOJXXbbMGeXLZn3cepBwneUPklB4Q6zkouIdCkZZIzyOkLp4XgCP55idmD+DNmeoaGNlqDJmN+2wTWQv5GBj9DEEXBFHZ5f4hfn6ZEYWO7GlgOz0YeijuknvtvdCR+Iqr3Vn72UKhtoBQd2L74YwzCG/4CBYhGtknMc"}
The body of this signature seems to be base64 encoded, but decoded it is too long to be a SHA-384 checksum. It also doesn't appear to be a GPG signature.
How can I validate the package?

The sign script can be found on GitHub and contains the following code:
openssl_sign(file_get_contents($_SERVER['argv'][1]), $sha384sig, $pkeyid, OPENSSL_ALGO_SHA384)
// ...
$sha384sig = trim(base64_encode($sha384sig), '=');
So the value in the .sig file is not a plain checksum: it is an OpenSSL signature over the SHA-384 digest of the phar, made with the Composer private key and base64 encoded.
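You can verify this quickly: decoding the base64 value yields several hundred bytes (the raw output of openssl_sign(), i.e. a signature made with the project's private key), far more than the 48 bytes of a raw SHA-384 digest. A quick check, assuming composer.phar.sig has been saved to the current directory:
$sig = json_decode(file_get_contents('composer.phar.sig'), true);
$raw = base64_decode($sig['sha384']);
// prints a length of several hundred bytes -- a signature, not a 48-byte SHA-384 digest
echo strlen($raw), " bytes\n";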
Please note that the installer used to download the composer phar also checks this signature. Its code can also be found on GitHub:
$signature = $httpClient->get($url.'.sig');
if (!$signature) {
    out('Download failed: '.$errorHandler->message, 'error');
} else {
    $signature = json_decode($signature, true);
    $signature = base64_decode($signature['sha384']);
}
// ...
if (false === $disableTls) {
    $pubkeyid = openssl_pkey_get_public('file://'.$home.'/' . ($version ? 'keys.tags.pub' : 'keys.dev.pub'));
    $algo = defined('OPENSSL_ALGO_SHA384') ? OPENSSL_ALGO_SHA384 : 'SHA384';
    if (!in_array('SHA384', openssl_get_md_methods())) {
        out('SHA384 is not supported by your openssl extension, could not verify the phar file integrity', 'error');
        exit(1);
    }
    $verified = 1 === openssl_verify(file_get_contents($file), $signature, $pubkeyid, $algo);
    openssl_free_key($pubkeyid);
    if (!$verified) {
        out('Signature mismatch, could not verify the phar file integrity', 'error');
        exit(1);
    }
}
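Putting that together, here is a minimal standalone check you can run yourself; it mirrors what the installer does. It assumes composer.phar, composer.phar.sig and the public key published on the site (keys.tags.pub for tagged releases such as 1.0.0-beta1, keys.dev.pub for snapshots) have been saved next to the script; the file names are placeholders, so adjust them to wherever you put the downloads.
<?php
// verify-composer.php -- minimal verification sketch (paths and key file name are assumptions)
$phar      = file_get_contents('composer.phar');
$sigJson   = json_decode(file_get_contents('composer.phar.sig'), true);
$signature = base64_decode($sigJson['sha384']);

$pubkeyid = openssl_pkey_get_public('file://' . __DIR__ . '/keys.tags.pub');
if (false === $pubkeyid) {
    fwrite(STDERR, "Could not load the public key\n");
    exit(1);
}

// openssl_verify() returns 1 for a valid signature, 0 for an invalid one, -1 on error
$result = openssl_verify($phar, $signature, $pubkeyid, OPENSSL_ALGO_SHA384);

if (1 === $result) {
    echo "Signature valid\n";
} else {
    fwrite(STDERR, "Signature mismatch\n");
    exit(1);
}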

Related

How to add Ltv & CRL (offline) while injecting .p7s to a Pdf?

I inject a .p7s into a PDF using the code below:
PdfWriter pdfWriter = new PdfWriter("results/final1.pdf");
PdfDocument document = new PdfDocument(new PdfReader("results/prepared1.pdf"), pdfWriter, new StampingProperties().UseAppendMode());
Stream output = new FileStream("results/signed1.pdf", FileMode.Create);
ExternalInjectingSignatureContainer container2 = new ExternalInjectingSignatureContainer(_p7s);
List<byte[]> crlCollection = new List<byte[]>();
crlCollection.Add(File.ReadAllBytes(@"ks/mycrls.crl"));
PdfSigner.SignDeferred(document, "Signature1", output, container2);
I found this approach for adding LTV and tried it as below:
ICrlClient clrClient = new CrlClientOffline(File.ReadAllBytes(@"ks/mycrls.crl"));
addLTV("results/signed1.pdf", "results/final1.pdf", null, clrClient, null);
However, I do not see LTV enabled; the result is: "Revocation checks were not performed."
addLTV
public static void addLTV(String src, String dest, IOcspClient ocsp, ICrlClient crl, ITSAClient itsaClient)
{
    PdfReader reader = new PdfReader(src);
    PdfWriter writer = new PdfWriter(dest);
    PdfDocument pdfDoc = new PdfDocument(reader, writer, new StampingProperties().UseAppendMode());
    LtvVerification v = new LtvVerification(pdfDoc);
    SignatureUtil signatureUtil = new SignatureUtil(pdfDoc);
    IList<string> names = signatureUtil.GetSignatureNames();
    String sigName = names[names.Count - 1];
    PdfPKCS7 pkcs7 = signatureUtil.ReadSignatureData(sigName);
    if (pkcs7.IsTsp())
    {
        v.AddVerification(sigName, ocsp, crl, LtvVerification.CertificateOption.WHOLE_CHAIN,
            LtvVerification.Level.OCSP_CRL, LtvVerification.CertificateInclusion.NO);
    }
    else
    {
        foreach (var name in names)
        {
            v.AddVerification(name, ocsp, crl, LtvVerification.CertificateOption.WHOLE_CHAIN,
                LtvVerification.Level.OCSP_CRL, LtvVerification.CertificateInclusion.YES);
            v.Merge();
        }
    }
    pdfDoc.Close();
}
ExternalInjectingSignatureContainer
internal class ExternalInjectingSignatureContainer : IExternalSignatureContainer
{
    public ExternalInjectingSignatureContainer(byte[] signature)
    {
        Signature = signature;
    }

    public void ModifySigningDictionary(PdfDictionary signDic)
    {
    }

    public byte[] Sign(Stream data)
    {
        return Signature;
    }

    public byte[] Signature;
}
I want to improve this by adding the CRL info (offline). I have created a .crl file, but I don't know how to add the CRL while injecting the .p7s.
Timestamp
I know this is not strictly related to this question, but afterwards I will add a timestamp to the signature. Where can I find a free timestamp server (for development purposes)?
How to add Ltv & CRL (offline) while injecting .p7s to a Pdf?
This depends on the profile of the PDF signatures you create and the capabilities of the validators.
PKCS#7 Signatures as used in ISO 32000
The PDF standard ISO 32000, both in part 1 and part 2, defines a profile for CMS signatures in PDFs in section 12.8.3.3 ("PKCS#7 Signatures as used in ISO 32000" / "CMS (PKCS #7) signatures").
This profile requires revocation information to be included in the CMS container as a signed attribute.
Judging by your previous questions, you create the CMS signature container itself externally. To embed CRLs according to this profile, you therefore have to update your external code that produces the CMS container, or (if those signatures are created by a service you do not implement) ask the signature creation service provider to update their code, so that the CRL is included in a signed attribute as detailed in ISO 32000 section 12.8.3.3.2 ("Revocation Information" / "Revocation of CMS-based signatures").
CAdES signatures as used in PDF
ETSI, originally in TS 102 778 and later in the updated EN 319 142, defined profiles (the PAdES profiles) for CAdES signatures in PDFs. CAdES is a special profile of CMS. A rundown of these profiles has been copied into the updated PDF specification ISO 32000-2, section 12.8.3.4 ("CAdES signatures as used in PDF").
These profiles require revocation information to be embedded in an incremental update after the signed revision in a Document Security Store structure of PDF objects.
To embed CRLs according to these profiles, therefore, you take the signed PDF and add the CRL afterwards. This essentially is what your addLTV example does.
Why Revocation checks were not performed
In comments you mention that you use PAdES and add the CRL using your addLTV example but that Adobe Reader tells you that "Revocation checks were not performed."
If you read the text underneath that message, the cause becomes clear:
The selected certificate does not chain up to a certificate designated as trusted anchor (see the Trust Tab for details). The result is that revocation checks were not performed on this certificate.
If your validator cannot trace your signer certificate back (in a certificate chain) to a certificate it explicitly trusts, validation stops with an unknown validity. Revocation checks only make sense if the validator trusts the issuer of the signer certificate (directly or indirectly); only in this case of trust by issuer the validator needs to verify whether the issuer revoked the certificate.

Image watermark not working in CodeIgniter

I am adding text as a watermark on a picture in CodeIgniter, but the code does not run properly.
public function new()
{
    $this->load->library('image_lib');
    $config['sourse_image'] ='./SaveImage/fer.jpg';
    $config['wm_text'] = 'SaleScrap';
    $config['wm_type'] = 'text';
    $config['wm_font_path'] = 'system/fonts/texb.ttf';
    $config['wm_font_size'] = '16';
    $config['wm_font_color'] = 'ffffff';
    $config['wm_vrt_alignment'] = 'bottom';
    $config['wm_hor_alignment'] = 'center';
    $config['wm_padding'] = '20';
    $this->image_lib->initialize($config);
    if($this->image_lib->watermark())
    {
        echo "Success";
    }
    else
    {
        echo "Error";
    }
}
The code does not work and gives no output. When it runs, the log file shows these errors:
ERROR - 2018-08-29 11:49:54 --> You must specify a source image in your preferences.
ERROR - 2018-08-29 11:49:54 --> Your server does not support the GD function required to process this type of image.
SOLUTION 1
The config key is misspelled: it must be source_image, not sourse_image. This is what triggers the "You must specify a source image" error:
$config['source_image'] = './SaveImage/fer.jpg';
SOLUTION 2
You should try this structure:
$this->load->library('image_lib');
// Set your config up
$this->image_lib->initialize($config);
// Do your manipulation
$this->image_lib->clear();
If that doesn't help, the second log error (the missing GD extension) may be the actual issue.
Check whether GD is installed; on a RHEL/CentOS-style system you would do:
sudo yum list installed | grep php
If it is not installed, install it:
sudo yum install php-gd-package-name
Then restart Apache:
sudo service httpd restart
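Putting both solutions together, a corrected version of the controller method could look like the sketch below. The image and font paths are taken verbatim from the question and assumed to exist; display_errors() is used instead of a bare "Error" so the library reports what actually went wrong (for example, a missing GD extension).
public function new()
{
    $this->load->library('image_lib');

    $config['source_image']     = './SaveImage/fer.jpg'; // note: source_image, not sourse_image
    $config['wm_text']          = 'SaleScrap';
    $config['wm_type']          = 'text';
    $config['wm_font_path']     = 'system/fonts/texb.ttf';
    $config['wm_font_size']     = '16';
    $config['wm_font_color']    = 'ffffff';
    $config['wm_vrt_alignment'] = 'bottom';
    $config['wm_hor_alignment'] = 'center';
    $config['wm_padding']       = '20';

    $this->image_lib->initialize($config);

    if ($this->image_lib->watermark())
    {
        echo "Success";
    }
    else
    {
        // shows the actual reason for the failure instead of a generic "Error"
        echo $this->image_lib->display_errors();
    }

    $this->image_lib->clear();
}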

How to download a file generated by JS from a Firefox add-on?

I need my extension to be able to generate a text file and save it inside the downloads folder. Please give me an example of how to do this.
The downloads API is what you are probably looking for:
https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/downloads
The downloads.download() function lets you download a file from a URL to your Downloads folder. Here is an example based on the downloads.download() documentation page.
function onStartedDownload(id) {
    console.log('Started downloading: ' + id);
}

function onFailed(error) {
    console.log('Download failed: ' + error);
}

var downloadUrl = "https://www.mozilla.org/media/img/home/2018/cards/irl-season-3.821df676279d.png";

var downloading = browser.downloads.download({
    url : downloadUrl,
    filename : 'mozilla-home.png',
    conflictAction : 'uniquify'
});

downloading.then(onStartedDownload, onFailed);
If you need to download data created in Javascript, then you'll first have to create a URL for that data using URL.createObjectURL()

Auto-updates to Electron

I'm looking to add an auto-update feature to an Electron application I have; however, I am finding it difficult to find resources on the web.
I've built a self-contained application using Adobe AIR before, and it seemed a lot easier to write update code that checked a URL and automatically downloaded and installed the update across Windows and Mac OS X.
I am currently using the electron-boilerplate for ease of build.
I have a few questions:
How do I debug the auto-update feature? Do I set up a local connection and test through that using a local Node server, or can I use any web server?
In terms of signing the application, I am only looking to run apps on Mac OS X and particularly Windows. Do I have to sign the applications in order to run auto-updates? (I managed to do this with Adobe AIR using a local certificate.)
Are there any good resources that detail how to implement the auto-update feature? As I'm having difficulty finding some good documentation on how to do this.
I am also new to Electron, but I think there is no simple auto-update in electron-boilerplate (which I also use). Electron's auto-updater uses the Squirrel.Windows installer, which you also need to integrate into your solution in order to use it.
I am currently trying to use this:
https://www.npmjs.com/package/electron-installer-squirrel-windows
And more info can be found here:
https://github.com/atom/electron/blob/master/docs/api/auto-updater.md
https://github.com/squirrel/squirrel.windows
EDIT: I just opened the project to try it for a while, and it looks like it works. It's pretty straightforward. These are pieces from my gulpfile.
In the current configuration, I use electron-packager to create a package.
var packager = require('electron-packager');
var Q = require('q');                // assumed require (not shown in the original snippet): promise library used below
var gulpUtil = require('gulp-util'); // assumed require: gulp logging helper

var createPackage = function () {
    var deferred = Q.defer();
    packager({
        //OPTIONS
    }, function done(err, appPath) {
        if (err) {
            gulpUtil.log(err);
        }
        deferred.resolve();
    });
    return deferred.promise;
};
Then I create an installer with electron-installer-squirrel-windows.
var squirrelBuilder = require('electron-installer-squirrel-windows');
var createInstaller = function () {
    var deferred = Q.defer();
    squirrelBuilder({
        // OPTIONS
    }, function (err) {
        if (err)
            gulpUtil.log(err);
        deferred.resolve();
    });
    return deferred.promise;
}
Also, you need to add some code for Squirrel to your Electron background/main code. I used the electron-squirrel-startup package.
if(require('electron-squirrel-startup')) return;
The whole thing is described in the electron-installer-squirrel-windows npm documentation mentioned above. That bit of documentation looks sufficient to get started.
Now I am working on Electron branding through Squirrel and on creating appropriate gulp scripts for automation.
You could also use standard Electron's autoUpdater module on OS X and my simple port of it for Windows: https://www.npmjs.com/package/electron-windows-updater
I followed this tutorial and got it working with my Electron app, although the app needs to be signed for updates to work, so you would need:
certificateFile: './path/to/cert.pfx'
in the task config, and:
"build": {
"win": {
"certificateFile": "./path/to/cert.pfx",
"certificatePassword": "password"
}
},
In the package.json
Are there any good resources that detail how to implement the auto-update feature? As I'm having difficulty finding some good documentation on how to do this.
You don't have to implement it yourself. You can use the autoUpdater provided by Electron and just set a feed URL. You need a server that provides the update information in a form compliant with the Squirrel protocol.
There are a couple of self-hosted options (https://electronjs.org/docs/tutorial/updates#deploying-an-update-server) or a hosted service like https://www.update.rocks.
Question 1:
I use Postman to validate that my auto-update server URLs return the response I am expecting. When I know that the URLs provide the expected results, I know I can use those URLs within the Electron's Auto Updater of my Application.
Example of testing Mac endpoint with Postman:
Request:
https://my-server.com/api/macupdates/checkforupdate.php?appversion=1.0.5&cpuarchitecture=x64
JSON response when there is an update available:
{
    "url": "https://my-server.com/updates/darwin/x64/my-electron-app-x64-1.1.0.zip",
    "name": "1.1.0",
    "pub_date": "2021-07-03T15:17:12+00:00"
}
Question 2:
Yes, your Electron app must be code signed to use the auto-update feature on Mac. On Windows I'm not sure, because my Windows Electron app is code signed and I did not try without it. It is recommended that you sign your app even if auto-update could work without it (not only for security reasons, but mainly because otherwise your users get scary warnings from Windows when they install your app for the first time, and they might just delete it right away).
Question 3:
For good documentation, you should start with the official Electron autoUpdater documentation; as of 2021-07-07 it is really good.
The hard part is figuring out how to make things work for Mac. For Windows it's a matter of minutes and you are done. In fact...
For Windows auto-update, it is easy to set up: you just have to put the RELEASES and .nupkg files on a server and then use that URL as the feed URL within your Electron app's autoUpdater. So if your app's update files are located at https://my-server.com/updates/win32/x64/ you would point the Electron autoUpdater to that URL, and that's it.
For Mac auto-update, you need to give the Electron autoUpdater the absolute URL of the latest .zip of your app. To make the Mac autoUpdater work, you therefore need a way to get a JSON response in a very specific format. Sadly, you can't just put your Electron app's files on your server and expect it to work on Mac just like that; the autoUpdater needs a URL that returns the aforementioned JSON response, so the feed URL you give it must point to something that can produce that response.
The way you achieve this can be anything, but I use PHP simply because that's the server I already pay for.
So in summary, with Mac, even if your files are located at https://my-server.com/updates/darwin/x64/ you will not provide that URL to the Electron autoUpdater as the feed URL. Instead you will provide another URL which returns the expected JSON response.
Here's an example of my main.js file for the Electron main process of my App:
// main.js (Electron main process)
// Assumed requires -- not shown in the original snippet:
const { app, autoUpdater } = require('electron');
const log = require('electron-log');          // or whatever logger you use
const isMac = process.platform === 'darwin';

function registerAutoUpdater() {
    const appVersion = app.getVersion();
    const os = require('os');
    const cpuArchitecture = os.arch();
    const domain = 'https://my-server.com';
    const windowsURL = `${domain}/updates/win32/x64`;
    const macURL = `${domain}/api/macupdates/checkforupdate.php?appversion=${appVersion}&cpuarchitecture=${cpuArchitecture}`;

    // init the autoUpdater with the proper update feed URL
    const autoUpdateURL = `${isMac ? macURL : windowsURL}`;
    autoUpdater.setFeedURL({url: autoUpdateURL});
    log.info('Registered autoUpdateURL = ' + (isMac ? 'macURL' : 'windowsURL'));

    // initial checkForUpdates
    autoUpdater.checkForUpdates();

    // automatic 2-hour interval checkForUpdates loop
    setInterval(() => {
        autoUpdater.checkForUpdates();
    }, 7200000);
}
And here's an example of the checkforupdate.php file that returns the expected JSON response back to the Electron Auto Updater:
<?php
// FD Electron App Mac auto update API endpoint.
// The way Squirrel.Mac works is by checking a given API endpoint to see if there is a new version.
// If there is no new version, the endpoint should return HTTP 204. If there is a new version,
// however, it will expect an HTTP 200 JSON-formatted response, containing a url to a .zip file:
// https://github.com/Squirrel/Squirrel.Mac#server-support

$clientAppVersion = $_GET["appversion"] ?? null;
if (!isValidVersionString($clientAppVersion)) {
    http_response_code(204);
    exit();
}

$clientCpuArchitecture = $_GET["cpuarchitecture"] ?? null;
$latestVersionInfo = getLatestVersionInfo($clientAppVersion, $clientCpuArchitecture);
if (!isset($latestVersionInfo["versionNumber"])) {
    http_response_code(204);
    exit();
}

// Real logic starts here when the basics did not fail
$isUpdateAvailable = isUpdateAvailable($clientAppVersion, $latestVersionInfo["versionNumber"]);
if ($isUpdateAvailable) {
    http_response_code(200);
    header('Content-Type: application/json;charset=utf-8');
    $jsonResponse = array(
        "url" => $latestVersionInfo["directZipFileURL"],
        "name" => $latestVersionInfo["versionNumber"],
        "pub_date" => date('c', $latestVersionInfo["createdAtUnixTimeStamp"]),
    );
    echo json_encode($jsonResponse);
} else {
    // no update: must respond with a status code of 204 No Content.
    http_response_code(204);
}
exit();
// End of execution.
// Everything below here are the function declarations.

function getLatestVersionInfo($clientAppVersion, $clientCpuArchitecture): array {
    // override path if client requests an arm64 build
    if ($clientCpuArchitecture === 'arm64') {
        $directory = "../../updates/darwin/arm64/";
        $baseUrl = "https://my-server.com/updates/darwin/arm64/";
    } else if (!$clientCpuArchitecture || $clientCpuArchitecture === 'x64') {
        $directory = "../../updates/darwin/";
        $baseUrl = "https://my-server.com/updates/darwin/";
    }
    // default name with version 0.0.0 avoids failing
    $latestVersionFileName = "Finance D - Tenue de livres-darwin-x64-0.0.0.zip";
    $arrayOfFiles = scandir($directory);
    foreach ($arrayOfFiles as $file) {
        if (is_file($directory . $file)) {
            $serverFileVersion = getVersionNumberFromFileName($file);
            if (isVersionNumberGreater($serverFileVersion, $clientAppVersion)) {
                $latestVersionFileName = $file;
            }
        }
    }
    return array(
        "versionNumber" => getVersionNumberFromFileName($latestVersionFileName),
        "directZipFileURL" => $baseUrl . rawurlencode($latestVersionFileName),
        "createdAtUnixTimeStamp" => filemtime(realpath($directory . $latestVersionFileName))
    );
}

function isUpdateAvailable($clientVersion, $serverVersion): bool {
    return
        isValidVersionString($clientVersion) &&
        isValidVersionString($serverVersion) &&
        isVersionNumberGreater($serverVersion, $clientVersion);
}

function getVersionNumberFromFileName($fileName) {
    // extract the version number with regEx replacement
    return preg_replace("/Finance D - Tenue de livres-darwin-(x64|arm64)-|\.zip/", "", $fileName);
}

function removeAllNonDigits($semanticVersionString) {
    // use regex replacement to keep only numeric values in the semantic version string
    return preg_replace("/\D+/", "", $semanticVersionString);
}

function isVersionNumberGreater($serverFileVersion, $clientFileVersion): bool {
    // receives two semantic versions (1.0.4) and compares their numeric value (104)
    // true when server version is greater than client version (105 > 104)
    // note: collapsing the digits breaks down once a component has two digits
    // (e.g. 1.0.10 vs 1.1.0); PHP's built-in version_compare() is more robust here.
    return removeAllNonDigits($serverFileVersion) > removeAllNonDigits($clientFileVersion);
}

function isValidVersionString($versionString) {
    // true when it matches semantic version numbering: 0.0.0
    return preg_match("/\d\.\d\.\d/", $versionString);
}

laravel "invalid host" on loadbalancer redirects

Background: I'm working on an API which I host on EC2 servers. I just finished the login and set up an nginx load balancer which forwards to those servers' internal IPs. The domain name points to the load balancer.
This used to work well with CodeIgniter, but now I keep getting an "Invalid Host" problem.
Googling pointed me to trusted proxies, so I installed fideloper's package and followed his post as well (the laravel-4-trusted-proxies guide, plus his sample on GitHub: fideloper/TrustedProxy), but I still get the same error:
UnexpectedValueException
Invalid Host "api.myserver.im, api.myserver.im"
// as the host can come from the user (HTTP_HOST and depending on the configuration, SERVER_NAME too can come from the user)
// check that it does not contain forbidden characters (see RFC 952 and RFC 2181)
if ($host && !preg_match('/^\[?(?:[a-zA-Z0-9-:\]_]+\.?)+$/', $host)) {
    throw new \UnexpectedValueException(sprintf('Invalid Host "%s"', $host));
}
Can someone help me?
I had the same issue as well. I had to resort to modifying the UrlGenerator.php file, which is part of the framework (bad, I know...), just to get this to work.
So here's my "temporary" solution.
Add a value to your app.php config file, e.g.:
return array(
    'rooturl' => 'https://www.youractualdomainname.com',
    ...
Next, add the modification below to your UrlGenerator.php file (trunk/vendor/laravel/framework/src/Illuminate/Routing/UrlGenerator.php):
<?php namespace Illuminate\Routing;

use Config;
...

protected function getRootUrl($scheme, $root = null)
{
    $approoturl = Config::get('app.rooturl');
    $root = isset($approoturl) ? $approoturl : $this->request->root();
    return $root;

    // if (is_null($root))
    // {
    //     $root = $this->forcedRoot ?: $this->request->root();
    // }
    // $start = starts_with($root, 'http://') ? 'http://' : 'https://';
    // return preg_replace('~'.$start.'~', $scheme, $root, 1);
}
Do note that composer update will revert your modification.
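A less invasive variant of the same idea, as a sketch only: if your Laravel version exposes the UrlGenerator::forceRootUrl() setter (the method behind the $this->forcedRoot that the commented-out code reads), you can force the root URL from your own code instead of editing the vendor file, so a composer update will not wipe the change out.
// app/start/global.php (or a service provider) -- assumes URL::forceRootUrl()
// exists in your framework version; 'app.rooturl' is the config value added above.
URL::forceRootUrl(Config::get('app.rooturl'));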
