Config::set and Artisan::call - Laravel

Every time I run the commands below, the migration runs on the default database. Note the database I've selected:
Config::set('database.connections.mysql.database', 'somedatabasename');
Artisan::call('migrate');
Anyone know why this is not working?

You could implement that by using different environments: for example, one config for the testing environment and another for local / staging / production. Could you elaborate on what you're actually trying to achieve and what the context is, so we can answer in more depth?
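If the goal is just to run migrations against a database whose name you swap in at runtime, a likely culprit is that Laravel may already have resolved the connection, so changing the config afterwards has no effect until that connection is purged. A minimal sketch of that approach, assuming the mysql connection and database name from the question (--force merely skips the production confirmation prompt):

use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\DB;

// Point the existing "mysql" connection at the other database.
Config::set('database.connections.mysql.database', 'somedatabasename');

// Drop any connection that was already opened with the old settings.
DB::purge('mysql');

// Run the migration explicitly against that connection.
Artisan::call('migrate', ['--database' => 'mysql', '--force' => true]);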

Related

Could not init Postgres with a dump file

Just starting with Testcontainers. I love the idea. Thanks for investing in this project.
I am trying to create a simple Postgres 14.5 container (and succeeded), and now I am trying to populate it using the .withInitScript() method.
The file I am feeding into the init method is a dump I created with pg_dumpall.
Testcontainers fails for many parsing/validation reasons; each time I delete a portion, another error pops up.
Should I be able to successfully use withInitScript with pg_dump files?
BTW, using pg_dump for my main DB also has many similar issues.
Thanks!
Try copying the script to the container so Postgres will execute it. Although the comment "BTW, using pg_dump for my main DB also has many similar issues" makes me wonder whether it will work, because, if I understood correctly, it also fails when you use the database directly.
// Copy the dump into the image's init directory; the entrypoint runs it on first start.
new PostgreSQLContainer("postgres:14.5")
    .withCopyFileToContainer(
        MountableFile.forClasspathResource("init.sql"),
        "/docker-entrypoint-initdb.d/init.sql"
    );
We recommend using Liquibase or Flyway to manage database changes.
Hi, and thanks for the help.
I managed to make things work by stripping some things from the SQL dump and using copyFileToContainer.
Thanks!

Why is there no data inside of my database when step debugging my Laravel Sail application?

Background
I am using PhpStorm to debug a test that creates a database entry. My goal is to set a breakpoint, then inspect the database manually.
I have confirmed this so far:
Step debugging is properly configured
Can connect via a forwarding port set up in docker-compose.yml (Fig. 1)
Laravel reports the entry exists in the database (Fig. 2)
Relevant Code
The star below indicates my breakpoint.
...
use Illuminate\Foundation\Testing\LazilyRefreshDatabase;
...
class ObfuscatedTestClass extends TestCase
{
    use LazilyRefreshDatabase;

    ...

    /** @test */
    public function obfuscated_test_name() {
        Queue::fake();
        ObfuscatedModelName::factory()->create();
  *     Queue::assertPushed(SyncLeaseWithAccountingApp::class);
    }
}
Hypotheses
Maybe I'm misunderstanding how databases are handled during these tests. I know Laravel can use database transactions to speed up tests, but I expected it to still modify the database here since I'm using LazilyRefreshDatabase. Why else would I need to set up a database for testing?
Figures
Figure 1
Figure 2
I've been in the same boat all day, and I also suspected transactions as a likely cause.
Then I came to realize that all that was needed was to refresh the database connection when stopped at a breakpoint.
I'm not sure if this will be helpful to anyone but I thought I should answer because my situation is so similar to OP's - also running Docker and PhpStorm.
We're typically not in the habit of refreshing the db connection on a regular basis so it's easy to overlook.
EDIT:
To add a little more to my answer from earlier: the data will persist if your test does not wipe it, which can be useful. You can then tinker with that data by creating a file named ".env.testing" and setting APP_ENV=testing, then running tinker like so: php artisan tinker --env=testing
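For reference, a minimal ".env.testing" along those lines might look like the sketch below. The values are placeholders and should match whatever database your docker-compose.yml / Sail setup actually exposes; APP_ENV=testing is the essential part.

APP_ENV=testing
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=testing
DB_USERNAME=sail
DB_PASSWORD=password

With that in place, php artisan tinker --env=testing opens a tinker session against the test database so you can inspect the rows your test left behind.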

Bash script invoked in FreeRADIUS

Can you please help me invoke my bash script from FreeRADIUS? I would like to run my script each time a user is granted access to my network via FreeRADIUS.
I tried to insert my script into the SQL queries file (/etc/freeradius/3.0/mods-config/sql/main/mysql/queries.conf), but the script is never invoked.
If you have any idea on how to do this please let me know.
Thank you in advance!
Adding random things to the SQL configuration isn't going to help here.
You need to configure the exec module, the best example is in mods-enabled/echo (though also see mods-enabled/exec). There are examples in that file on how to point to the script that you want to run, and what it should return.
Then to ensure that it is run after a successful authentication, make sure that echo (or whatever instance name you gave to the module configuration) is listed in the post-auth{} section of the correct virtual server, most likely sites-enabled/default.
Note that calling out to external scripts is nearly always a bad idea, as it will cause performance to drop significantly. There is usually a better way to solve the problem.
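As a rough sketch (the script path and the attribute passed to it are placeholders, not something FreeRADIUS requires), the module definition in mods-enabled/echo looks roughly like this:

exec echo {
    wait = yes
    # Script to run; %{User-Name} expands to the authenticated user.
    program = "/usr/local/bin/notify_access.sh %{User-Name}"
    input_pairs = request
    output_pairs = reply
    shell_escape = yes
}

Then list the instance name in the post-auth section of sites-enabled/default (or whichever virtual server handles your requests):

post-auth {
    ...
    echo
}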

Serverless Detect Running Locally

I am running a command like the following.
serverless invoke local --function twilio_incoming_call
When running locally, I plan to detect this in my code and, instead of looking for POST variables, look for a mock file I'll be giving it.
However, I don't know how to detect whether I'm running under this local command.
How do you do this?
I looked around on the Serverless website and found lots of info about running locally, but nothing about detecting whether you are running locally.
I found out the answer. process.env.IS_LOCAL will detect if you are running locally. Missed this on their website somehow...
If you're using AWS Lambda, it sets some built-in environment variables. In the absence of those variables, you can conclude that your function is running locally.
https://docs.aws.amazon.com/lambda/latest/dg/lambda-environment-variables.html
const isRunningLocally = !process.env.AWS_EXECUTION_ENV
This method works regardless of the framework you use, whether that is Serverless, Apex Up, AWS SAM, etc.
You can also check what is in process.argv:
process.argv[1] will equal '/usr/local/bin/sls'
process.argv[2] will equal 'invoke'
process.argv[3] will equal 'local'

Embedded MongoDB instance?

I've been using MongoDB for a little tool that I'm building, but I have two problems that I don't know if I can "solve". Those problems are mainly related to having to start a MongoDB server (mongod).
The first is that I have to run two commands every time that I want to use it (mongod and my app's command) and the other is testing. For now, I'm using different collections for "production" and "test", but it would be better to have just an embedded / self-contained instance that I can start and drop whenever I want.
Is that possible? Or should I just use something else, like SQLite for that?
Thanks!
Another similar project is https://github.com/Softmotions/ejdb.
The query syntax is similar to MongoDB's.
We use this at work - https://github.com/flapdoodle-oss/embedmongo.flapdoodle.de - to fire up embedded Mongo for integration tests. Has worked really well.
I haven't tried it, but I just found this Ruby implementation of an embedded MongoDB: https://github.com/gdb/embedded-mongo
