My database structure looks like this:
User
    id - integer
user_user_group
    user_id - integer
    user_group_id - integer
meeting_user_group
    user_group_id - integer
    meeting_id - integer
meeting_mails
    meeting_id - integer
    sent - boolean
    date - date
meetings
    id - integer
I have a user from whom I need to get the meeting_mails where sent == false and date == today.
Is there a smart way to do this in Laravel, or do I have to write an ugly SQL query by hand?
For performance reasons, I would like to do as little as possible with collections or arrays and let the database do as much of the work as possible.
I'm using Laravel version 5.8
I am using this package now. It is an easy way to build such relationships.
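For reference, if you want to stay on plain Laravel, a query-builder version that keeps all of the filtering in the database could look roughly like this (a sketch, assuming the table and column names from the question and an already loaded $user):

use Illuminate\Support\Facades\DB;

// All joins and filters run in the database; only the matching rows come back
$mails = DB::table('meeting_mails')
    ->join('meeting_user_group', 'meeting_user_group.meeting_id', '=', 'meeting_mails.meeting_id')
    ->join('user_user_group', 'user_user_group.user_group_id', '=', 'meeting_user_group.user_group_id')
    ->where('user_user_group.user_id', $user->id)
    ->where('meeting_mails.sent', false)
    ->whereDate('meeting_mails.date', now()->toDateString())
    ->select('meeting_mails.*')
    ->distinct()
    ->get();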
I have come across problems with Laravel relations when two model ids are otherwise identical, but one has a leading zero and the other has not.
Product ID | Productname
-----------|------------
012345     | Product A
12345      | Product B
If those relations are loaded in the same query, only the first one is returned and the other is not.
The database columns are strings, and in the Product model I have set $incrementing to false and cast the id attribute to string. Doesn't Laravel's eager loading take leading zeros into account?
I'm not able to change those product ids with leading zeros.
Thanks in advance!
I just ran a test on a local Laravel installation and I can't reproduce the problem.
Are you sure it is not a problem with the way you build the query? If at any point in your program the $id variable is treated as an integer, the conversion back to string will have lost the leading zero.
For example, maybe you get the id from the request:
$productId = $request->get('product_id');
At this point $productId may be treated as an integer, so if you use $productId to query your DB, the leading zero will be lost.
You need to make sure that the variable is never converted to an integer during the lifecycle of your request.
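For example, an explicit cast keeps it a string all the way to the query (a sketch; 'product_id' is just the hypothetical request input name used above):

// Keep the id as a string; (int) '012345' would become 12345 and match the wrong product
$productId = (string) $request->get('product_id');   // "012345"
$product = Product::find($productId);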
You can test the query itself using tinker by fetching your products manually:
Product::find('012345');
Product::find('12345');
I've got 2 classes:
Reports - objectId, Title, Date & relationItem (a Relation-type column linked to Items)
Items - objectId, Title, Date, etc.
I want to query all the Items that are related to a given objectId in Reports. Users create reports and then add items to them. These items are found in the Items table.
I've looked at https://parseplatform.github.io/docs/ios/guide/#relations but don't see anything for Swift 3.
I've tried a few things with little success. I did find the snippet below, but I'm not sure how to apply it to my classes.
var relation = currentUser.relationForKey("product")
relation.query()?.findObjectsInBackgroundWithBlock({
I would love somebody to point me in the right direction! Thanks
I tried the code below too!
var query = PFQuery(className:"Items")
query.whereKey("relationItem ", equalTo: PFObject(withoutDataWithClassName:"Reports", objectId:"MZmMHtobwQ"))
OK, so I had to change the table slightly to get this to work and avoid a query within a query.
I've added a Relation-type column to the Items table instead of the Reports table.
Then I managed to retrieve all the Items based on that report's objectId like this:
let query = PFQuery(className:"Items")
query.whereKey("reportRelation", equalTo: PFObject(withoutDataWithClassName:"Reports", objectId:"3lWMYwWNEj"))
This then worked. Note that reportRelation is the Relation-type column.
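For completeness, the actual fetch on that query can then be run with findObjectsInBackground, roughly like this (a sketch, assuming the Swift 3 Parse SDK):

query.findObjectsInBackground { (items, error) in
    if let items = items {
        // items are the Items whose reportRelation contains that report
        print(items)
    } else if let error = error {
        print(error.localizedDescription)
    }
}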
Thanks
When you’re thinking about one-to-many relationships and whether to implement Pointers or Arrays, there are several factors to consider. First, how many objects are involved in this relationship? If the “many” side of the relationship could contain a very large number (greater than 100 or so) of objects, then you have to use Pointers. If the number of objects is small (fewer than 100 or so), then Arrays may be more convenient, especially if you typically need to get all of the related objects (the “many” in the “one-to-many relationship”) at the same time as the parent object.
http://parseplatform.github.io/docs/ios/guide/#relations
If you are working with a one-to-many relation, use a Pointer or an Array. See the guide for examples and more explanation.
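As a rough Swift illustration of the two shapes (a sketch; the class and key names are just placeholders based on the question above):

// Option 1 - Pointer on the "many" side: each Item points at its Report (scales to large counts)
let item = PFObject(className: "Items")
item["report"] = PFObject(withoutDataWithClassName: "Reports", objectId: "3lWMYwWNEj")
item.saveInBackground()

// Option 2 - Array on the "one" side: the Report keeps a small list of Item pointers
let report = PFObject(withoutDataWithClassName: "Reports", objectId: "3lWMYwWNEj")
report.addUniqueObject(PFObject(withoutDataWithClassName: "Items", objectId: "someItemId"), forKey: "items")
report.saveInBackground()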
I am using InfluxDB and want to write Epoch time values into a user-defined column, as shown below in the v1 field.
cpu_load,host=server01,core=0 value=0.45,v1=1437171724
cpu_load,host=server01,core=0 value=0.45,v1=1437171725
Now, how can I query this column just like I can query the regular time-based column?
select * from cpu_load where v1 > '2016-08-31 00:42:24.000'
This query is not working; however, if I swap v1 for the time column it works just fine.
select * from cpu_load where time > '2016-08-31 00:42:24.000'
How can I use a user-defined time value/column like this in InfluxDB?
InfluxDB only supports field types of string, integer, float, and boolean.
The time column is a special case. Even though it is stored as an integer under the hood, only the time field can be filtered with time-based constraints.
There is a longstanding feature request to allow comparison of fields to time.
In the meantime, the raw integer Epoch value can be used to set a constraint on the v1 field. Since v1 was written in seconds, the equivalent of the time filter above is:
select * from "cpu_load" where "v1" > 1472604144
I have two columns in a database, one called purchase_date and the other called lifetime. I am trying to filter the results so that only the assets that are not past their lifetime are shown, given the current tax year.
I've tried
Asset.where("? < DATE_ADD(purchase_date,INTERVAL lifetime YEAR) AND purchase_date >= ?",Date.new(2012,1,1),Date.new(2012,1,1))
But then I realized that this would be tied to MySQL if it did work, which would be fine if this only had to run against my full application. However, I want something that is database-agnostic, especially since this is being tested against SQLite. SQLite gives me the following exception for this query:
ActiveRecord::StatementInvalid: SQLite3::SQLException: near lifetime
How can I add an integer column to a date column in my SQL statement in a database-agnostic way?
Update
I have played around with Squeel, but I don't think it's going to work. It may be possible to detect the adapter and base the query on that.
Why not add an :expires_at field? You can set a before_save hook to make sure it is automatically kept in sync with your :purchase_date and :lifetime fields.
before_save :set_expiration_date

# Use a lambda so the time is evaluated when the scope is called, not at class load
scope :not_expired, -> { where("expires_at > ?", DateTime.now) }

def set_expiration_date
  return unless purchase_date_changed? || lifetime_changed?
  self.expires_at = purchase_date + lifetime.years
end
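The column itself would come from a small migration along these lines (a sketch, assuming a Rails 3-era app, an assets table, and that existing rows should be backfilled):

class AddExpiresAtToAssets < ActiveRecord::Migration
  def change
    add_column :assets, :expires_at, :date

    # Backfill existing rows so the not_expired scope works immediately
    Asset.reset_column_information
    Asset.find_each do |asset|
      asset.update_attribute(:expires_at, asset.purchase_date + asset.lifetime.years)
    end
  end
end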
The current application uses JPA to auto-generate table/entity ids. Now a requirement calls for manually inserting data into the database using SQL queries.
So the questions are:
1. Is it worth creating a sequence in this schema just for this little requirement?
2. If the answer to 1 is no, then what could be a plan B?
1. Yes. A sequence is trivial - why would you not do it?
2. N/A
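For example, the manual inserts could draw their ids from a dedicated sequence (a sketch, assuming an Oracle-style database, a hypothetical mytable, and a starting value above the ids JPA has already generated):

-- Hypothetical names; START WITH should be above any existing ids
CREATE SEQUENCE mytable_manual_seq START WITH 100000 INCREMENT BY 1;

INSERT INTO mytable (id, name)
VALUES (mytable_manual_seq.NEXTVAL, 'manually loaded row');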
A few ways:
Use a UUID. UUIDs are large, pseudo-random alphanumeric strings that are, for practical purposes, guaranteed to be unique once generated (see the sketch after this list).
Does the data have something unique, like a timestamp or an IP address? If so, use that.
A combination of the current timestamp + some less unique value in the data.
A combination of the current timestamp + some integer i that you keep incrementing.
There are others (including generating a checksum, custom random numbers instead of UUIDs, etc.), but those have the possibility of overlaps, so I am not mentioning them.
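For the UUID option, Oracle's SYS_GUID() is one way to generate such a value (a sketch; it assumes the id column is a character column wide enough to hold the 32-character hex string):

-- SYS_GUID() returns a globally unique RAW(16); RAWTOHEX turns it into a 32-char string
INSERT INTO mytable (id, name)
VALUES (RAWTOHEX(SYS_GUID()), 'row keyed by a generated UUID');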
Edit: Minor clarifications
Are you just doing a single data load into an empty table, and there are no other users concurrently inserting data? If so, you can just use ROWNUM to generate the IDs starting from 1, e.g.
INSERT INTO mytable
SELECT ROWNUM AS ID
,etc AS etc
FROM ...