Databaseless Laravel + Codeception - turn off database connection completely? - laravel-4

I am building an API-calling app which needs no database. While trying to register a new class, I updated my Composer dependencies and it updated lots of packages, including Laravel (now 4.1.25). Now when I run my tests (Codeception) they break because the framework is asking for a database connection. I can't see anything in the Laravel config options, or in the DB connection code, that lets me turn off the database completely so it never looks for a connection.
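One way to keep the framework from ever reaching for a real database, a sketch rather than anything from the original thread, assuming the test suite boots the app in Laravel's "testing" environment, is to give the default connection an in-memory SQLite database in an environment-specific config file:

<?php
// app/config/testing/database.php — Laravel 4 merges this over app/config/database.php
// whenever the app runs in the "testing" environment, so normal config is untouched.
return array(
    // Anything that lazily resolves the DB gets a throwaway in-memory database
    // instead of trying to reach a real server.
    'default' => 'sqlite',
    'connections' => array(
        'sqlite' => array(
            'driver'   => 'sqlite',
            'database' => ':memory:',
            'prefix'   => '',
        ),
    ),
);

If the app truly never touches a database, it's also worth checking that the session and cache drivers in app/config/ aren't set to 'database'. Removing Illuminate\Database\DatabaseServiceProvider from the providers array in app/config/app.php is a heavier option, but other packages may expect it to be registered.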

Related

An error occurred handling a request for the Admin UI: Error: Prisma error: The table `main.User` does not exist in the current database

I deployed a Keystone app to Heroku, but when I tried to open the app I got the following error:
An error occurred handling a request for the Admin UI: Error: Prisma error: The table main.User does not exist in the current database.
I have a screenshot with more details about the error.
I tried to locate the database and create the User table myself.
I'd like to know the steps to solve this issue.
It looks like your DB hasn't been initialised properly. The error you've included is failing to count the items in the User list, which (if you don't have sessions configured) is likely the first query to run: a count of items in each list is shown on the Admin UI landing page, so that's the first thing it does.
So something about how your migrations are being generated or applied in production isn't set up right. Most of the relevant docs on how this works are in the CLI guide; specifically, see the section about database migrations and the db.useMigrations flag.
Having db.useMigrations turned off can be handy if you're just playing around in dev. Keystone will automatically sync your DB structure to what's defined in your list configs whenever it starts, and does so without creating any physical migration files. If you're prototyping some change or just mucking around, this may be what you want, but if you're deploying somewhere it's better to turn db.useMigrations on. Then, if Keystone detects changes to the DB when it runs, it'll prompt you to create a migration file, which can be tweaked to protect existing data if needed, tracked under version control (e.g. git) and deployed.
Getting these migrations to run in an environment like Heroku is slightly weird, as (assuming it's enabled for your app) Heroku can auto-scale, while migrations need to be run exactly once. You also can't just lock the DB and run migrations when the first instance of the app starts: this delays the start-up of the HTTP server, so if the migrations run for too long, Heroku may think the deployment has failed.
The way we suggest getting around this is to run migrations in the build stage. Fans of the 12-factor app methodology will notice this violates the separation of build and release stages, but for a simple Heroku deploy it works fine. For larger or more serious apps, creating and applying migrations is usually an area that needs significant thought and attention. The specific infrastructure and rollout processes required will be project dependent.
I'd also encourage you to check out the Keystone 6 Heroku example codebase if you haven't already. It's a little out of date but it shows the migrations and package.json scripts in action.

How to migrate Laravel database on Docker onto Google Cloud

My Laravel app runs on Docker, using Linux commands on a Windows machine, which means that instead of 'php artisan' commands I use 'sail artisan' commands. I am able to migrate locally onto the SQL server on Docker (sail artisan migrate) and am able to deploy the app onto Google Cloud (gcloud app deploy). The final piece of the puzzle is migrating the database onto the Google Cloud SQL server.
When I was first setting the app up I had problems with this, so I exported the SQL database, uploaded it to Google Cloud and deployed it manually, which was fine as a one-off, but I now have things I would like to change about the database structure without losing all the data. I also figured it was about time I learnt to do it properly anyway.
I have attempted to use the instructions in Google's community tutorial; however, that guide presupposes the project does not yet exist, whereas I am working with a pre-existing application and a pre-existing database. I tried to jump into the tutorial halfway through, but I couldn't get the Cloud SQL proxy to work.
After a bit more research I found this article, which I got partway through; however, once I reached the part needing TCP or Unix sockets, neither set of commands would run without error in my Ubuntu terminal.
If anyone knows of any useful articles or has had this problem themselves, I would greatly appreciate your help.
Additional Info:
Laravel Framework 8.69.0
Vue version 3.0.5
Docker Engine Community 20.10.8
SQL server 'europe-west2'
I'm not aware of any way to run sail/artisan command-line commands on GCP.
However, in ./vendor/facade/ignition/src/Solutions/RunMigrationsSolution.php there is the following function
public function run(array $parameters = [])
{
    Artisan::call('migrate');
}
which can be used to programmatically run migrations.
So in ./routes/web.php make a route like
Route::get('/run_migrate',
[DataController::class, 'runMigrate']
)->middleware(['auth', 'verified'])->name('run_migrate');
and in ./app/Http/Controllers/DataController.php add the function:
public function runMigrate() {
    // requires: use Illuminate\Support\Facades\Artisan;
    Artisan::call('migrate', ['--force' => true]); // --force skips the confirmation prompt when APP_ENV=production
    return Artisan::output(); // return the command output so you can see what ran
}
Now, after signing in as an authorised user (or, if you remove the auth middleware, without signing in at all), you can go to https://your-app-url.com/run_migrate and it will run any outstanding migrations.
To be able to do this the migrations table has to exist, so you will have to import your local database to start.
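The answer above assumes the deployed app can already reach Cloud SQL, which touches the part of the question about TCP vs. Unix sockets. On App Engine, Laravel normally talks to Cloud SQL over a unix socket rather than TCP; here is a rough sketch of the mysql connection in config/database.php, assuming a DB_SOCKET environment variable of the form /cloudsql/PROJECT:REGION:INSTANCE set in app.yaml (the instance name below is only a placeholder):

// config/database.php, inside 'connections' => [...]
'mysql' => [
    'driver'      => 'mysql',
    // When DB_SOCKET is set (App Engine), MySQL connects through the socket and ignores host/port.
    'unix_socket' => env('DB_SOCKET', ''),          // e.g. /cloudsql/my-project:europe-west2:my-instance
    'host'        => env('DB_HOST', '127.0.0.1'),   // still used for local sail/docker runs
    'port'        => env('DB_PORT', '3306'),
    'database'    => env('DB_DATABASE', 'laravel'),
    'username'    => env('DB_USERNAME', 'root'),
    'password'    => env('DB_PASSWORD', ''),
    'charset'     => 'utf8mb4',
    'collation'   => 'utf8mb4_unicode_ci',
],

With something like that in place, the /run_migrate route above runs against the Cloud SQL instance when deployed and against the local Docker database when run through Sail.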

Which is better Redis on Lumen or Laravel?

I just learned about Redis and I want to try to create a scalable web application. To achieve this I'm going to use Laravel as the main app and Lumen as the microservice (API). After learning about Redis I want to add it to my project, but I'm confused: I tried to get an explanation from Google with no luck, and I'm still confused after reading a lot of tutorials.
My questions are:
Should I run it separately from the app server? (Because from what I saw on Docker, Redis runs in a separate container.)
Should I attach it to the Laravel app? (Because Laravel is the main app.)
Thank you
To connect Redis to Laravel, see the official Laravel documentation.
To connect Lumen to Redis, see these links:
lumen doc for cache
lumen doc for queue
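On the Lumen side there is a little extra wiring, because Lumen ships with most providers disabled. A sketch, assuming the illuminate/redis package has been added via Composer and a reasonably recent Lumen version:

// bootstrap/app.php
// composer require illuminate/redis predis/predis   (predis avoids needing the phpredis extension)
$app->configure('database');                                   // load config/database.php, including its redis block
$app->register(Illuminate\Redis\RedisServiceProvider::class);  // bind the Redis manager so cache/queue can use it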
You can put Redis on any server you want and connect Laravel or Lumen to it with the following in your .env file:
REDIS_HOST="your Redis server"
REDIS_PORT="the port Redis listens on (6379 by default)"
REDIS_PASSWORD="the password set in Redis"
Note: you are not forced to connect Redis to Laravel; if you only need it in Lumen, that is fine.
First of all, Redis is an in-memory data structure store that is used as a database, cache and message broker (What is Redis). It is similar to a database (DB) you would connect to, but not something you can bundle inside your app.
It sits somewhere, running as a daemon, and you connect to it for the purposes of caching, message brokering, etc.
Now that you know you cannot embed it in your app, ask yourself: do you want faster caching or session management? Do you have the resources to support it? If yes, then you should connect to Redis.
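To make "connecting to Redis for caching" concrete, here's a small illustration, not from the original answer, of what application code looks like once CACHE_DRIVER=redis is set and the connection details are in place (the users:count key and the User model are purely illustrative):

use Illuminate\Support\Facades\Cache;

// Cache the value for 300 seconds; the closure only runs on a cache miss,
// so repeated requests are answered straight from Redis.
$count = Cache::store('redis')->remember('users:count', 300, function () {
    return \App\Models\User::count();
});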
Take note of something, however: if you are going to run both Lumen and Laravel on the same system, you have to make certain changes to the environment files of the two applications.
E.g. in .env (Laravel app) you can rename keys like REDIS_HOST to REDIS_HOST_LARAVEL while keeping REDIS_HOST in .env (Lumen app); another example is renaming DB_HOST to something like MY_DB_HOST. Change the corresponding entries in the config/ files accordingly (see the sketch below).
For some reason the two apps can behave strangely when Lumen and Laravel run on the same server and both connect to Redis for cache or session management.
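For the renaming trick above, the matching change on the Laravel side would be in config/database.php, roughly like this (REDIS_HOST_LARAVEL is just the example key from the previous paragraph, not a name Laravel knows by default):

// config/database.php (Laravel app)
'redis' => [
    'client' => env('REDIS_CLIENT', 'phpredis'),

    'default' => [
        'host'     => env('REDIS_HOST_LARAVEL', '127.0.0.1'), // renamed so it can't collide with the Lumen app's REDIS_HOST
        'password' => env('REDIS_PASSWORD', null),
        'port'     => env('REDIS_PORT', 6379),
        'database' => env('REDIS_DB', 0),
    ],
],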

Parse Server Migration, Swapping Client Code

As a server rookie and Parse user, I need to migrate, and I intend to move to Parse Server, likely with Heroku and mLab.
Once I have clicked Migrate and Finalise in the Parse Dashboard, all data from my original Parse client code goes to the new database, right?
Once migrated, I can just push an update of my client code with the new Parse Server SDK pointing to the new server?
My main overriding question is: do I need to do any management on the client side, sending data to both servers, or does the Parse migration handle this?
I think you are mixing two different things. Read the tutorial.
Simply
Step 1
You should move your data from Parse.com to a self-hosted database (mLab, your own MongoDB, etc.). This step means that api.parse.com will use the "external" database, but you will still use the code and servers from Parse.com (when you send a query from your app, it goes to api.parse.com, which then accesses the database). Do this by the end of April 2016.
Step 2
Move from api.parse.com to your own instance of Parse Server (the one you download from GitHub or install on Heroku). You will need to change the code in your app because it won't use api.parse.com from this point on. Do this by the end of July 2016.
On GitHub the developers still say that it is not "production ready", so you should only migrate your database now and build the whole server later. You can read the discussion here.

Laravel 5 Openshift Database not connecting

I am setting up a Laravel 5 OpenShift application, but every time I include the database code in the project it fails with the "Whoops" error page. I have added my database credentials to the .env environment file and still have no success. I am wondering what may be the cause of this, since I followed all the instructions and the website works, but only if I omit my database code.
Are you trying to get your database working for local or remote development? The .env file in the root directory is for local development, while the .openshift/.env file is for remote development. If you're using a standard OpenShift database (such as MySQL or PostgreSQL), you shouldn't need to make any configuration changes to get the database working. It's already configured via environment variables.
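If you do want to reference those variables explicitly, the classic OpenShift (v2) MySQL cartridge publishes its credentials as environment variables; here is a sketch of the mysql connection in config/database.php that falls back to them (the OPENSHIFT_* names assume the MySQL cartridge and are worth double-checking against your gear's env output):

// config/database.php, inside 'connections' => [...]
'mysql' => [
    'driver'    => 'mysql',
    'host'      => env('OPENSHIFT_MYSQL_DB_HOST', env('DB_HOST', 'localhost')),
    'port'      => env('OPENSHIFT_MYSQL_DB_PORT', env('DB_PORT', '3306')),
    'database'  => env('OPENSHIFT_APP_NAME', env('DB_DATABASE', 'homestead')),  // v2 cartridges create a DB named after the app
    'username'  => env('OPENSHIFT_MYSQL_DB_USERNAME', env('DB_USERNAME', 'homestead')),
    'password'  => env('OPENSHIFT_MYSQL_DB_PASSWORD', env('DB_PASSWORD', '')),
    'charset'   => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix'    => '',
],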
