Is hyperledger composer meant for production - hyperledger-composer

Based on the training videos I have watched, they all talk about Composer as something you use to "define and test your business network", but it almost feels like once it's verified and tested, you throw it away and use something more serious. Am I wrong? Even the term "playground" feels "experimental" -- something you mess with, but that's all. So, would you build a production system based on Composer? How do you split it into multiple machines for redundancy? If Composer is so slick and the "obvious" choice for building Hyperledger applications, when would you NOT use it and use Fabric or Sawtooth instead? I am trying to get started and don't want to waste time on the wrong path. If you were to build a "serious" supply chain application with multiple players, which framework and approach would you take? Thank you

Hyperledger Composer is not a separate platform. It uses Hyperledger Fabric "under the hood". The main limitation of Composer is that it does not expose all of Fabric's options and features through its interface, so it allows for rapid prototyping at the expense of flexibility. Composer does have an impressive user interface.
Unfortunately, Composer is now in maintenance mode with no new features being put into it. See https://lists.hyperledger.org/g/composer/message/125
I would consider Hyperledger Sawtooth or Hyperledger Fabric for permissioned blockchain applications.

Related

Separating different parts of the project in Git

How can I efficiently separate different parts of a project in Git? I have a Laravel web application that includes an admin panel plus an API for a mobile app, and I want to increase performance. I thought it would be a good idea to separate the admin part from the API so I can disable a service provider in the API, run the admin panel on a different server (connecting to the database via remote MySQL), and dedicate a server to the API. How can I separate these parts without duplicating changes that I make in common parts like models? I thought of creating them as two branches in a Git repository. Is there a better way to do this separation, or an easier-to-maintain approach to the optimization as a whole?
Update: The issue I'm facing is the response time. I put the following code into my routes, and it takes 400-600ms to respond.
Route::any('/test2', function () {
    return "test";
});
I tested it on two different servers, and the configuration is good enough, I think (10 GB RAM, 4 CPU cores at 3.6 GHz). By the way, I have fewer than 1k requests per hour for now, and soon I'm looking at 5k-20k at most.
I think dividing your source code into modules is good enough. Take a look at Laravel Modules.
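For instance - assuming the community nwidart/laravel-modules package, which is the usual implementation of this pattern - splitting the app could look like this:

composer require nwidart/laravel-modules
php artisan module:make Admin
php artisan module:make Api

Each module then gets its own routes, controllers and service providers, so you keep one repository while still separating concerns.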
I suggest you do what the creator of the framework (Taylor Otwell) does: build packages and use Composer.
In the Laravel community, you have many packages available, like Horizon, Nova, Telescope, the Spatie packages, etc.
If you want to add one, you just add a Composer dependency and it works out of the box.
You can do the same with the code that needs to live in both projects, like models.
Every package gets its own Git repo.
This is a more Laravel-like way to do it than separating into modules (compared to the Symfony world). Laravel doesn't come with modules at its core.
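As a minimal sketch - assuming a hypothetical your-org/shared-models package living in its own Git repo - each project pulls the shared code in via its composer.json:

{
    "repositories": [
        { "type": "vcs", "url": "https://github.com/your-org/shared-models" }
    ],
    "require": {
        "your-org/shared-models": "^1.0"
    }
}

A change to a shared model is then made once in the package repo and picked up by both projects with composer update.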
Now, about separating the projects:
Reading your requirements, I am not sure you will have performance issues if you run the API and the admin panel in the same project, unless you have millions of HTTP calls per hour.
I am currently working on a project with a lot of client-side code; we also have an API with thousands of calls per hour, and everything is fine. We also run Nova for the internal backend.
Consider that by the time you hit those scaling problems, you will probably have database problems too, and maybe server problems (bandwidth, memory, cost, etc.).
Being 100% scalable is not an easy task.
In my opinion, when you face it, solve it. Separating the API and admin panel at the beginning could add too much overhead to starting and maintaining the project.

Use Laravel to refactor old, large PHP application partially over time?

From what I've read this should be possible due to the modular nature of Laravel, but I need assurance from people with more Laravel experience:
I have a very large (500k LOC) ancient PHP app. So ancient that some parts of it date from PHP 3 times (ca. 2000; PHP 4 had already been released, but PHP 3 was used for backwards-compatibility reasons).
Refactoring this is a huge project, and the only way to reasonably do it is in parts. Replace this part, then that part, etc. Fortunately, the "ancient" part comes in handy as no framework was used and basically every page is its own script, with a few central libraries for shared functionality.
Is it possible to spin up a Laravel app that can route new/refactored pages to the new site and everything else (wildcard if possible) to the ancient code? All data is stored in a database, so there will be no sync issues between them except for user authentication and session info.
Is it possible to get Eloquent running on an ancient DB design, or to refactor the DB in such a way that it works for both? There was a previous attempt to move the DB interface to Doctrine which, from what I know, was aborted after partial success (i.e. many DB objects are accessed through Doctrine, but there is also a lot of straight SQL code in parallel).
It's a huge mess, but the software in question is still being used and successfully so and a previous attempt to replace it with something else has already failed.
Additional details:
Thanks @maiorano84 for the good questions:
First, does your legacy application have tests?
Negative on that. PHPUnit was released in 2004. At that time, this app had already been in production for four years.
Second, are you able to get it to work on a more recent version of PHP?
Yes, the current codebase is running on PHP 5.6.33 - it has been maintained throughout the years, and a major update was made during the transition from PHP 4 to PHP 5.
If it runs on PHP 5.3+, you can use Instant Refactoring
I'm an author of Rector, a tool that can migrate a huge number of PHP files in a few seconds, e.g. upgrade PHP 5.3 to PHP 7.4, upgrade Symfony 2.8 to 4.2, or migrate from Nette to Symfony (read the case study).
1. Install Rector
composer require rector/rector --dev
2. Create rector.php with PHP sets, since you have old PHP code
// rector.php
use Rector\Core\Configuration\Option;
use Rector\Set\ValueObject\SetList;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::SETS, [
        SetList::PHP_52,
        // use one set at a time, as sets build on each other
        // SetList::PHP_53,
        // SetList::PHP_54,
    ]);
};
3. Then run Rector on your code (e.g. /src directory)
vendor/bin/rector process src
You can also write your own rules, so you can convert the code to a Laravel/MVC approach. The idea is to write one rule that, e.g., converts 100+ files to controllers.
Read more in the GitHub repository.
Is it possible? Yes.
Is it going to take a short amount of time? Absolutely not.
With any kind of legacy codebase, you're going to need to take the time in figuring out all of its moving parts and figuring out what portions are going to need to change in order to even be able to work on a modern platform.
The most recent version of Laravel requires PHP 7.1.3, so even attempting to just dump the entire codebase into a Laravel application is very likely going to result in failure.
First, does your legacy application have tests? These can be unit tests, integration tests, or functional tests. If not, and you want to be able to modernize your application without breaking things in the future, then you're going to want to write tests to ensure that nothing breaks as you begin upgrading. This alone can take a long time, especially with a codebase that makes it difficult to even test in the first place. Having a fully tested application will allow you to see which tests begin to fail as you start reworking your application, so this information will be extremely valuable.
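As a minimal sketch of what such a test could look like for this kind of legacy code - a characterization (or "golden master") test that locks in the current output of one page-script; the paths and file names here are hypothetical:

use PHPUnit\Framework\TestCase;

final class LegacyReportPageTest extends TestCase
{
    public function testReportPageOutputIsUnchanged(): void
    {
        // Run the legacy page-script and capture whatever it prints.
        ob_start();
        require __DIR__ . '/../legacy/report.php';
        $actualHtml = ob_get_clean();

        // Compare against a snapshot captured before refactoring began.
        $this->assertStringEqualsFile(
            __DIR__ . '/fixtures/report.golden.html',
            $actualHtml
        );
    }
}

A test like this says nothing about whether the old behaviour is correct - only whether you changed it, which is exactly what you need to know while refactoring.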
Second, are you able to get it to work on a more recent version of PHP? If this code is already in production, then you're going to need to use some hardware virtualization through Vagrant, or better yet, containerization through Docker to get a local installation up and running without breaking your production code.
Once that's ready, then you should be able to begin refactoring. Taking entire pages of code and dumping them right into a Laravel application is not going to work straight out of the gate. You're going to want to start smaller. Find all of your moving parts, figure out what each one is responsible for, and encapsulate them in classes with the appropriate methods.
Use Composer's PSR-4 Autoloader to help remove all of those extra include and require statements and load your new classes throughout the application.
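For example, assuming the new classes live in a src/ directory, the relevant composer.json section would be:

{
    "autoload": {
        "psr-4": {
            "App\\": "src/"
        }
    }
}

After running composer dump-autoload, a single require 'vendor/autoload.php'; at the entrypoint replaces the scattered includes.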
Use a decent Router to change all of your URLs into SEO-friendly paths and have a clearly defined entrypoint for all requests.
Move all of your business logic out of the webroot: create a /public folder in which you have just your index.php entrypoint and all public-facing assets (images, css, javascript, etc.). Since all requests are being routed to this file by this point, you should be able to process the request and return your response. A wildcard fallback for not-yet-migrated pages could look like the sketch below.
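To address the wildcard part of the question, here is a minimal sketch - assuming the old page-scripts are moved into a legacy/ directory outside the webroot; the names are hypothetical, and Route::fallback() requires Laravel 5.5 or newer - of routing everything the new application doesn't know about to the ancient code:

// routes/web.php
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::fallback(function (Request $request) {
    // Map the request path onto a legacy page-script.
    $script = realpath(base_path('legacy/' . $request->path()));

    // Refuse paths that escape the legacy directory or don't exist.
    if ($script === false || strpos($script, base_path('legacy') . DIRECTORY_SEPARATOR) !== 0) {
        abort(404);
    }

    // Run the old script as-is and hand its output back to Laravel.
    ob_start();
    require $script;
    return response(ob_get_clean());
});

Because the fallback only fires when no explicit route matches, each refactored page simply gets a real route and automatically drops out of the legacy path.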
Once you get to a point where you've actually gotten the application into a system of well-defined components and modules, then migrating over to Laravel - or any other well-established framework - should be much easier.
This is going to take you a long time if you plan on doing it right. Hopefully this helps, and best of luck to you.
Refactoring is of course possible, but I have some doubts about whether it is doable partially in this case. By partially, I mean that in production, parts of the app will sometimes run on old code and sometimes on new code.
I did this once for an old project, but not one as ancient and big as yours.
In my case it was a custom app (without any framework) running on PHP 5.3, and I was converting it to Laravel 4.2.
I must admit that there are some real challenges along the path.
This is only the tip of the iceberg, but I'll try to name a few of them, from what I remember:
PHP version compatibility, or rather incompatibility in this case. You can rewrite the existing code to run on the latest PHP 7 versions. That might be a lot of work, however - and it was not used in the end.
Routing and asset handling - you need to check whether you can modify the URLs so they fit into the Laravel routing engine. It may be really hard, especially if the old app is not using Laravel-style paths and you don't want to break Google indexing, for example. I have also seen systems with custom URL generators which were then heavily used in views. Trying to match those routes perfectly would be a nightmare.
Authentication. Changing auth must be done in one step, because adapting Laravel to properly work with sessions from the old system (although doable) will clutter the new code.
Database. You will be lucky if the database is well designed, but I don't think it will be even close to Laravel's Eloquent conventions. Although you can run it on Laravel without any DB schema modifications (see the sketch below), the workarounds will bloat your new code. This and other things can be refactored again in the complete system, but that's another load of work.
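To illustrate the kind of per-model overrides needed to keep Eloquent working against a non-conventional schema (the table and column names here are hypothetical):

use Illuminate\Database\Eloquent\Model;

class LegacyCustomer extends Model
{
    protected $table = 'TBL_CUSTOMERS';   // legacy table name
    protected $primaryKey = 'CUST_NO';    // non-standard key column
    public $incrementing = false;         // key is not auto-incrementing
    public $timestamps = false;           // no created_at/updated_at columns
}

Every model needs this treatment, which is exactly the bloat mentioned above.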
Considering the number of suboptimal workarounds needed, if you want a properly designed app (built with best practices), it will probably be better to rebuild from scratch.
Hope it helps a bit...

What is happening when you deploy a BNA file to Hyperledger Composer?

After I've put together my business network definition, what is actually happening on the peers after I deploy that package? I'm especially interested in how a Hyperledger peer can interpret JavaScript, since that doesn't appear to be a supported language for chaincode.
The Composer chain code is written in Go. It uses the Duktape Javascript interpreter to execute the user (and system) JS code within a Go process.
The Composer chain code maps the public JS API to the underlying Fabric Go API calls.
From a Fabric perspective this is just a "normal" piece of Go chain code, albeit quite a complex one!
When you "deploy" a business network using the Composer CLI, you are actually doing 2 things:
deploying the Composer chain code (Go) and starting it
deploying the bytes of the business network archive and storing it in world-state, so that it is available to the interpreter when you submit transactions
In the future we would like to replace the use of Duktape with native Node.js execution. Thanks to Fabric's modular architecture (and its use of Docker containers and gRPC) this should be possible.

Is it possible to use Nodejs Crypto module functionality in Fabric Composer?

I would like to be able to use various functions from the Node.js crypto module in my Fabric Composer chaincode. Is that possible? I tried hashing a simple string within my chaincode but it didn't work - in Playground I received this error message: 'TypeError: Cannot read property 'apply' of null.', and using the CLI against a local HLF instance, the transaction never completed. I tested my JavaScript hashing code separately and it works, but not when I try to run it within chaincode.
At the moment you cannot use Node modules in transaction processor functions - you're limited to base JavaScript. It's possible that Node.js support will come in the future, and making crypto APIs available in Composer is another option being considered.
If you want to try something in the meantime, crypto-browserify might be worth investigating. I don't know if anyone has got that to work though, so please report back if you try it.

What's the traditional way to build a RESTful App via Laravel?

I'm going to build my first REST app with Laravel and a client-side framework I haven't chosen yet (probably React or Vue.js).
I did some research on how I should build my app, and unfortunately it got me even more confused.
I've come to the conclusion that I can build my app in one of two ways:
Build the app in the same project, but without Laravel Blade.
Separate the app into two projects (front end and back end).
On the one hand, the pros of building the app on the same project:
Grants me the option to use Laravel Mix.
Laravel offers out of the box Vue support.
On the other hand, the pros of building the app separated from Front to Back:
Each side has its own single responsibility and can be refactored easily.
As I heard from my friends, it's more convenient (even though to me it sounds too complex).
I wanted to know the most popular way to build a RESTful app when Laravel is part of it. Is there another way besides the ones I mentioned?
Personally, I like to keep them apart because it's easier to maintain. Yes, you have to keep track of two different projects/folders/repositories, but they are pieces of the same cake.
Scaffolding an API in Laravel is very easy and simple; I assume you already know how to do that. You are worried about losing the advantages offered by Laravel Mix, but believe me, you are losing nothing.
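To give a sense of how little scaffolding the API side needs - a minimal sketch, assuming a hypothetical PostController and a recent Laravel version:

// routes/api.php
use App\Http\Controllers\PostController;
use Illuminate\Support\Facades\Route;

// One line wires up index/store/show/update/destroy JSON endpoints.
Route::apiResource('posts', PostController::class);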
Since your preference is Angular, just clone any seed-project repository with everything set up, e.g.:
1. AngularJS: https://github.com/angular/angular-seed
2. Angular 2: https://github.com/mgechev/angular-seed
As you can see, these seed projects already have all the build tools you need and now things seem actually easier. That's what frameworks are made for.
Now imagine later you want to add a mobile app to the stack. You don't even need to change a single thing. Your API already runs independently of the frontend and vice versa.
Question is opinion based... So here is my opinionated answer.
TLDR: For speed of development, and arguably more satisfaction, build it as one project. Don't overcomplicate things unnecessarily too early. When the project gets big enough, and starts to generate you some money, then think about splitting the projects - you will know when it is time.
The Laravel Ecosystem is just great for small, medium and even large applications.
Laravel gives you a resources folder where you can put all your JavaScript and front-end assets. You have Envoy to deploy your application and write your deployment scripts. You have Mix to build your assets - and you don't have to use Mix; you could write your own gulp/webpack/grunt setup instead.
By keeping it together as one project, you are able to use the same IDE project for both front-end and backend work, yet keep separation of concerns, because all backend code is completely separated from front-end code. You can tweak the payloads being sent from Angular and tweak how they are handled in the PHP API easily, so you only need one IDE, one browser, and a terminal client.
The nicest thing about keeping the project together is that, assuming you are using VCS (git) - and you really should be - your front end and back end will always be in sync with each other. Otherwise, you need to manage and coordinate deployments of your front-end and backend code.
When your application gets big enough, it won't take long to separate the projects, as the frontend and backend should already be extremely loosely coupled.
Just think of all the added layers of complexity that you are introducing to your application. When you deploy a change to your REST API, you will probably also need to deploy a change to your Angular application. Which version of the Angular app is compatible with which version of the API? If you have a team of devs working on specific projects, then this complexity pays off - but most teams have processes in place to manage, synchronise and automate deployments.
I think you should go with 2 projects. I would.
I will give an example using the rate of growth of complexity. It is just from my own experience (X is the number of features, Y is how complex they are to implement).
With a single project, it is super simple at first. No communication with the server, no hard stuff, everything is tangled. Then, nearing the end, it gets harder and harder to create more features/pages because everything is tangled.
Or as a function: [graph omitted: complexity rising steeply as the number of features grows]
But if you start with 2 projects, sure, it will be harder at first (communication, synchronisation, whatever), but the rate of growth of complexity will not be as high. Everything has its own responsibility, everything does what it needs to do. Tests are simpler, expansion is simpler, refactoring is simpler, and you can complete the project with ease.
Or as a function: [graph omitted: complexity rising slowly, close to linearly]
Clearly, from the graphs above, you can infer that the rate of growth of complexity for two projects is much slower. (And of course, these are not actual numbers; I did not measure anything or track such projects. This is just from my own experience.)
