I want to fetch all tables from my BigQuery dataset, but it returns only 50 tables. Here's my code:
# dataset_active_tbls.tables returns only 50 tables
bigquery = Google::Cloud::Bigquery.new(credentials: "../commons/bigquery_cred.json")
dataset_active_tbls = bigquery.dataset "my_dataset"
dataset_active_tbls.tables
How do I get all of the tables?
I tried:
dataset_active_tbls.tables.all
It fetches all the tables and their metadata, and it solved my problem.
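The behavior here is the same as any paged API: each call returns one page of results plus a token for the next page, and the all variant keeps fetching until the token runs out. A minimal language-agnostic sketch of that loop (the fetchPage helper and the page size of 50 are illustrative assumptions, not the BigQuery client's actual API):

```javascript
// Simulated paged API: returns up to `pageSize` items plus a token
// for the next page, or a null token when the data is exhausted.
function fetchPage(data, pageSize, token) {
  const start = token ?? 0;
  const items = data.slice(start, start + pageSize);
  const nextToken = start + pageSize < data.length ? start + pageSize : null;
  return { items, nextToken };
}

// The equivalent of `tables.all`: keep requesting pages until there
// is no next-page token, collecting every item along the way.
function fetchAll(data, pageSize) {
  const all = [];
  let token = null;
  do {
    const page = fetchPage(data, pageSize, token);
    all.push(...page.items);
    token = page.nextToken;
  } while (token !== null);
  return all;
}

const tables = Array.from({ length: 120 }, (_, i) => `table_${i}`);
console.log(fetchPage(tables, 50, null).items.length); // one page: 50
console.log(fetchAll(tables, 50).length);              // all pages: 120
```

Calling tables alone corresponds to fetchPage (one page), while tables.all corresponds to fetchAll.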
I am trying to group products into deleted and non-deleted items, but the code does not work (it also does not crash). I tried grouping by categories, but that still does not work. Any ideas?
const sales = await getRepository(Sales)
.createQueryBuilder('sales')
.where("sales.selfMerchantId = :id", {id: user.id})
.groupBy("sales.deleted_at")
.addGroupBy("sales.id")
.withDeleted()
.getMany();
return sales;
The two GROUP BY attributes in the query, sales.deleted_at and sales.id, make every row of the result set unique, because sales.id is already unique on its own. I would rather write two separate queries with WHERE clauses to get the deleted and the non-deleted records.
Update 1
I verified that the ORM returns the expected response for the above query: the non-deleted records come first, followed by the deleted records.
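Since soft-deleted rows are simply rows where deleted_at is set, the two-query split can be illustrated in plain code: one pass keeps records whose deleted_at is null, the other keeps the rest. A minimal sketch with in-memory data (the record shape is made up for illustration; in TypeORM each filter would be a WHERE clause on deleted_at combined with withDeleted()):

```javascript
// Sample rows as an ORM might return them; `deletedAt` is null for
// live rows and holds a timestamp for soft-deleted rows.
const sales = [
  { id: 1, deletedAt: null },
  { id: 2, deletedAt: '2023-01-05' },
  { id: 3, deletedAt: null },
];

// Split into non-deleted and deleted groups, mirroring the two
// WHERE-clause queries suggested above.
const active = sales.filter((s) => s.deletedAt === null);
const deleted = sales.filter((s) => s.deletedAt !== null);

console.log(active.map((s) => s.id));  // [1, 3]
console.log(deleted.map((s) => s.id)); // [2]
```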
I have a problem with a Laravel query over many records: it is very slow.
I have a table users that has 4934 records.
I have a pivot table, named for example user_relation_values, that has 17482 records.
I have a table values that has 20495 records.
From the front end I receive N user ids; it could be 1 id, 1000, 2000, or all 4934, so I don't know in advance how many I will get.
Starting from the ids received, I have to return the users together with their related values.
So in My backend I have a function like this:
$users = $request->input('users');
$usersValues = array();
foreach ($users as $user) {
    $u = User::find($user['id']);
    $u->values; // lazy-loads the relation: one extra query per user
    $usersValues[] = $u;
}
I have the relations in My User model:
public function values() {
    return $this->belongsToMany('App\Models\Values');
}
I have put indexes on the tables, but the query is still slow when I receive all of the ids.
If I receive all 4934 ids the query takes more than 20 seconds, yet I have read of people querying millions of records in just over 5 seconds.
How can I optimize my table or my query?
Looking at your example code, you're querying users in a loop (N queries), and then also lazy-loading values for each user (N more queries), so you issue 2N queries in total. Instead, you should fetch the users in a single query and eager-load the values relationship.
If you’re getting an array of user IDs as input, then you can pass that to Eloquent’s find() method:
$users = User::find($request->input('users'));
You should also eager-load the values relationship if you plan on using it in a loop:
$users = User::with('values')->find($request->input('users'));
This should dramatically reduce the number of queries you issue.
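To see why, you can count the queries each approach would issue. A hedged sketch with a fake query counter (the helper names are invented for illustration, not Laravel APIs): find-in-a-loop issues two queries per user, while find plus eager loading issues two queries total, no matter how many ids are passed.

```javascript
let queryCount = 0;
const runQuery = () => { queryCount += 1; };

// Loop approach: one `find` query plus one lazy-load of `values`
// per user — 2N queries for N ids.
function loadUsersInLoop(ids) {
  for (const id of ids) {
    runQuery(); // User::find($id)
    runQuery(); // $user->values lazy load
  }
}

// Eager approach: one query for all users, plus one batched query
// for all their values — 2 queries regardless of N.
function loadUsersEagerly(ids) {
  runQuery(); // User::with('values')->find($ids): the users query
  runQuery(); // the single batched `values` query
}

const ids = Array.from({ length: 4934 }, (_, i) => i + 1);

queryCount = 0;
loadUsersInLoop(ids);
console.log(queryCount); // 9868

queryCount = 0;
loadUsersEagerly(ids);
console.log(queryCount); // 2
```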
I need to get the count of all the rows in the pivot table user_engagement. I have defined a many-to-many relationship between users and engagements. I am getting the number with a raw query at the moment, but would like to change this query into an Eloquent one if possible:
$numberOfEngagements = DB::table('user_engagement')->count();
Create a model for your many-to-many pivot table and try this:
$count = App\UserEngagement::all()->count();
Faster and more memory-friendly, since it issues a COUNT query instead of loading every row:
$count = App\UserEngagement::count();
I want to get the full table, but only one record per distinct Doc_Type.
That is, it should pick only the records from the table that have a unique Doc_Type. I tried the following, but it returns a single column as a list, whereas I want the full rows. How can I do it?
var data = DB.tblDocumentTypes.Select(m => m.Doc_Type).Distinct().ToList();
You can use GroupBy:
var data = DB.tblDocumentTypes.GroupBy(m => m.Doc_Type).Select(x => x.First());
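The same group-then-take-first idea works in any language: key each record by Doc_Type and keep only the first record seen for each key, so full rows survive instead of a single column. A sketch with hypothetical sample rows:

```javascript
// Sample rows; only the first row per docType should be kept.
const docs = [
  { id: 1, docType: 'invoice', title: 'A' },
  { id: 2, docType: 'receipt', title: 'B' },
  { id: 3, docType: 'invoice', title: 'C' },
];

// Equivalent of GroupBy(m => m.Doc_Type).Select(g => g.First()):
// a Map keyed by docType, inserting only keys not yet seen.
function distinctBy(rows, keyFn) {
  const seen = new Map();
  for (const row of rows) {
    const key = keyFn(row);
    if (!seen.has(key)) seen.set(key, row);
  }
  return [...seen.values()];
}

const unique = distinctBy(docs, (d) => d.docType);
console.log(unique.map((d) => d.id)); // [1, 2]
```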
I need to insert into two tables in a single operation. Is this possible in LINQ?
At present I am calling InsertOnSubmit() twice.
If your tables have a primary key/foreign key relationship to each other, then you also have two objects which you can link to each other:
InternetStoreDataContext db = new InternetStoreDataContext();
Category c = new Category();
c.name = "Accessories";
Product p = new Product();
p.name = "USB Mouse";
c.Products.Add(p);
//and finally
db.Categories.InsertOnSubmit(c);
db.SubmitChanges();
That adds your object and all linked objects when submitting the changes.
Note that for that to work, you must have a primary key in both tables. Otherwise LINQ doesn't offer you the linking possibility.
Here are good examples of using LINQ to SQL: http://weblogs.asp.net/scottgu/archive/2007/05/19/using-linq-to-sql-part-1.aspx
The database submit doesn't happen until you call SubmitChanges. There is no tangible cost associated with multiple calls to InsertOnSubmit - so why not just do that?
This will still result in two T-SQL INSERT commands; a single regular INSERT statement simply cannot insert into two tables.