Laravel SQLite column index out of range

When running tests with PHPUnit I'm getting a not-so-nice exception:
SQLSTATE[HY000]: General error: 25 column index out of range (SQL: select count(*) as aggregate from "attachments" where "id" = f3bad3ad-a888-41bc-b6fd-9a5998f6b527)
The Attachment.Id is a UUID and the column is defined as the primary key. When running the tests I am using SQLite with an in-memory database.
When I switch over to MySQL I no longer get the error. I don't really know why the query would cause an error.
Any tips?

1) Make sure the property public $incrementing = false; is set on the model.
2) The migration should be $table->uuid('id')->primary();
3) Add the following method to the model:
public function getKeyType()
{
    return 'string';
}
4) To diagnose, create a database.sqlite file and run the test against that database, then inspect the size of the ID column. Does it fit the UUID string? If it doesn't, you may have to update your Laravel app or edit your migrations so that, when running on SQLite, they create a string column of the appropriate length.
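Putting steps 1–3 together, a minimal UUID-keyed model might look like this. The creating hook and the Str::uuid() call are illustrative assumptions, not part of the original answer:

```php
use Illuminate\Database\Eloquent\Model;
use Illuminate\Support\Str;

class Attachment extends Model
{
    // UUID keys are not auto-incrementing integers.
    public $incrementing = false;

    // Tell Eloquent the key is a string so it is never cast to int.
    protected $keyType = 'string';

    // Illustrative assumption: generate the UUID when the model is created.
    protected static function boot()
    {
        parent::boot();

        static::creating(function (Model $model) {
            if (empty($model->{$model->getKeyName()})) {
                $model->{$model->getKeyName()} = (string) Str::uuid();
            }
        });
    }
}
```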

Related

Out of memory error trying to fetch 1 Eloquent record when eager loading BelongsToMany

I have this query:
Order::find('1234')->refunds()->with('orderItems')->get()
This causes an out-of-memory error, exhausting 256 MB, while fetching one row whose loaded items also contain only one row.
If I try this, I have no issues:
Order::find('1234')->refunds->first()->orderItems
It is only when I do eager loading.
My pivot definition:
public function items(): BelongsToMany
{
    return $this->belongsToMany(OrderItem::class)
        ->using(OrderItemRefund::class)
        ->withPivot([
            'amount',
        ]);
}
When I look at the query it is failing on in the trace, I see the following:
SELECT
`order_items`.*,
`order_item_refund`.`refund_id` AS `pivot_refund_id`,
`order_item_refund`.`order_item_id` AS `pivot_order_item_id`,
`order_item_refund`.`amount` AS `pivot_amount`
FROM
`order_items`
INNER JOIN `order_item_refund` ON `order_items`.`id` = `order_item_refund`.`order_item_id`
WHERE
`order_item_refund`.`refund_id` IN ( 0 )
AND `order_items`.`deleted_at` IS NULL
The key problem appears to be how Eloquent is internally constructing the clause refund_id IN ( 0 ): it isn't binding the relevant refund_id value for the pivot table. If I run this as a raw query on the DB, it returns all records satisfied by the join, without filtering by refund_id, which would have produced only a single record.
The bindings for the pivot query are empty when checked:
{
bindings: [],
connectionName: mysql,
executionTimeMs: 8663.32
}
However, all the bindings for the queries Eloquent executes before the pivot query have the expected values.
I've also discovered this is affecting all BelongsToMany relationships across all models. The code only started acting up after upgrading to Laravel 8 (specifically, we're on v8.83.14).
I've tried downgrading both Laravel and Doctrine to earlier minor versions while remaining on v8, but that didn't help.
As of Laravel 6, the primary key is cast to integer unless it is explicitly set to string:
https://laravel.com/docs/6.x/upgrade#eloquent-primary-key-type
The reason this happened to us is that we upgraded from Laravel 5.4 to Laravel 8 and have string-based primary keys.
Setting the following on the relevant models fixed the issue:
protected $keyType = 'string';
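This also explains the refund_id IN ( 0 ) clause: a string UUID cast to int in PHP yields 0. A minimal sketch of the fix, assuming the Refund model from the question:

```php
use Illuminate\Database\Eloquent\Model;

class Refund extends Model
{
    // Without this, Laravel 6+ casts the string key to int,
    // turning 'f3bad3ad-...' into 0 in the pivot-query bindings.
    protected $keyType = 'string';

    // String keys are not auto-incrementing.
    public $incrementing = false;
}
```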

Retrieve related table and select fields from primary table using Laravel Artisan

In the following code, the Users table has a related table, phoneNumbers. When I retrieve a list of all users like this,
return Person::with('phoneNumbers')->get();
everything works fine. However, when I attempt to specify a list of columns to return from the Person table, the phone_number comes back empty.
return Person::with('phoneNumbers')
    ->get(['fname', 'lname', 'email']);
If I add the number field or phone_number.number to the get array, I get an undefined column error. What is the Laravel way of handling this?
Try this:
return Person::select(['your_foreign_key', 'fname', 'lname', 'email'])
    ->with('phoneNumbers')
    ->get();
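The relation comes back empty because Eloquent matches the eager-loaded child rows back to their parents by key, so the parent's key must be among the selected columns (and the foreign key must be selected on the child side). A sketch, assuming id is the Person key and person_id is the foreign key on the phone numbers table:

```php
// Select the parent key so Eloquent can match eager-loaded children,
// and constrain the relation's columns (the foreign key must be included).
return Person::select(['id', 'fname', 'lname', 'email'])
    ->with('phoneNumbers:id,person_id,number')
    ->get();
```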

Laravel HasManyThrough returns empty collection

I am attempting to use HasManyThrough (which I think might be incorrect) to return the value of a column from a table via an intermediate table. From the File model I am attempting to access the Game table. The relations are:
A game can have many mods.
Many mods can have many files.
My table schema:
File Table
id - primary key
mod_id - foreign key
Mod Table
id - primary key
game_id - foreign key
Game Table
id - primary key
And this is how I am attempting to link them. I think the issue is that there is only one Game for all the files I am attempting to find, but I am not sure what the singular equivalent of HasManyThrough is.
In files.php the method I am attempting to use to map the relation is the following:
public function game()
{
    return $this->hasManyThrough(
        'App\Game',
        'App\Mod',
        'id',
        'id',
        'mod_id',
        'game_id'
    );
}
And I am attempting to get it via the following call:
$data = App\Files::with('game')->get();
Funnily enough, if I dump & die right after calling the above method it returns the game objects, but as soon as I attempt to eager-load the relation it returns an empty collection.
Any help would be greatly appreciated.
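Since a file ultimately belongs to a single game, one possible workaround (an assumption on my part, not from the question) is to skip hasManyThrough entirely and chain two belongsTo relations, eager-loading them as a nested relation:

```php
// On the File model: a file belongs to a mod.
public function mod()
{
    return $this->belongsTo('App\Mod');
}

// On the Mod model: a mod belongs to a game.
public function game()
{
    return $this->belongsTo('App\Game');
}

// Then eager-load the nested relation instead of hasManyThrough:
$data = App\Files::with('mod.game')->get();
```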

Laravel test preserving records in SQLite while rolling back migrations?

I am working on a Laravel 5.2 app. I am trying to set up testing using an SQLite in-memory database.
I'm aware of the SQLite problems related to changing columns that are NOT NULL without specifying a default (for example, as described in this question). An apparent workaround is to first make the field nullable, and then immediately undo that in a separate call. So I've spent many hours going over all my migrations, doing things like:
Schema::table('customers', function (Blueprint $table) {
    $table->string('first', 128)->after('id')->nullable();
});
Schema::table('customers', function (Blueprint $table) {
    $table->string('first', 128)->after('id')->nullable(false)->change();
});
I'm also aware of the SQLite problem where you can't drop more than one column per transaction (e.g. as described here). I spent many more hours updating my migrations to handle that.
All migrations now successfully run and rollback without errors using SQLite and :memory:; they also run fine in MySQL.
So I added a test, using migrations. The first and only thing the test does is use a Factory to create a customer:
<?php

namespace Tests\Unit;

use TestCase;
use Illuminate\Foundation\Testing\DatabaseMigrations;

class PromoTest extends TestCase
{
    use DatabaseMigrations;

    /** @test */
    public function dummy_test()
    {
        factory('App\Customer')->create();
        print_r(\DB::select('select * from customers'));
    }
}
The output of the query right after creating the customer shows that it worked fine (fake data comes from Faker):
$ ./vendor/bin/phpunit --filter PromoTest
PHPUnit 4.8.24 by Sebastian Bergmann and contributors.
EArray
(
[0] => stdClass Object
(
[id] => 1
[email] => nRyan@example.net
[zipcode] => 47049-6020
[subscribe] => 0
[created_at] => 2017-08-25 15:47:28
[updated_at] => 2017-08-25 15:47:28
[first] => Syble
[last] => Hahn
[phone] => (761)073-7794x8624
[invalid_email] =>
)
)
But immediately after that output, the test crashes and throws the following error:
SQLSTATE[23000]: Integrity constraint violation: 19 NOT NULL constraint failed: customers.name (SQL: INSERT INTO customers (id, email, zipcode, subscribe, created_at, updated_at, name) SELECT id, email, zipcode, subscribe, created_at, updated_at, name FROM __temp__customers)
Notice that the name field causing the problem does not appear in the created record above. The name field did exist in this table in the past, but not any more: I removed it (with a migration) and now use first and last. And clearly that migration worked, as the record created, inserted and dumped above has the right, current fields.
So why is Laravel trying to insert records into an older version of the table, after it has already successfully created a record in the current one?
I am only guessing, but it looks like that after the test is done and the migrations are being rolled back, the data is being preserved and copied or re-inserted into each historical iteration of the schema? There's no way that can work, though...?
Has anyone seen this? Is it expected or am I doing something wrong?
UPDATE: The problem goes away if I simply switch to using MySQL for the tests. Looks like I'll have to do that for now, as I'm stumped. I'd much rather use SQLite in memory, as it is faster.
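For anyone replicating the setup, a common way to point Laravel 5.x tests at an in-memory SQLite database is a dedicated connection in config/database.php, selected from phpunit.xml. The connection name here is an assumption, not from the question:

```php
// config/database.php — an in-memory SQLite connection for the test suite.
// Select it via <env name="DB_CONNECTION" value="sqlite_testing"/> in phpunit.xml.
'sqlite_testing' => [
    'driver'   => 'sqlite',
    'database' => ':memory:',
    'prefix'   => '',
],
```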

EF6 Oracle default value for a column in code first migration

I am trying to write a migration that sets Default Value for a NUMBER column in my Oracle database using EntityFramework 6. Here is my first try that didn't set the default value:
public override void Up()
{
    AddColumn("MTA.PLAN_SHEETS", "QUANTITY_CHANGED",
        c => c.Decimal(nullable: true, precision: 3, scale: 0, defaultValueSql: "1"));
}
I also tried to use defaultValue instead of defaultValueSql and again it didn't set the default value for the column.
Is there anything wrong with my migration code?
I came across the same problem working with Oracle and EF6. It would appear the Oracle provider does not support it. There is a workaround, though, in case you haven't already found one.
First, set the QuantityChanged property as nullable in your model (or via the Fluent API, wherever you are handling this). Then run the add-migration command, which will generate a migration file with an AddColumn call in the Up method. After that, add an explicit SQL command to update all existing values to the needed default. If you need the column to be NOT NULL moving forward, you will need another SQL command to modify the column and set it to NOT NULL.
public override void Up()
{
    AddColumn("MTA.PLAN_SHEETS", "QUANTITY_CHANGED",
        c => c.Decimal(precision: 3, scale: 0));
    Sql("UPDATE MTA.PLAN_SHEETS SET QUANTITY_CHANGED = 1");
    Sql("ALTER TABLE MTA.PLAN_SHEETS MODIFY QUANTITY_CHANGED NOT NULL");
}
I hope this helps. Reference my question if needed: How do I set a default value for a new column using EF6 migrations?
