I am using Learning Locker (a Learning Record Store).
I have succeeded in inserting statements into it via the REST API,
but I have not succeeded in fetching statements from it.
How do I perform a query on statements? Via the REST API?
I used the TinCanPHP library. This is how you establish a connection to the Learning Locker LRS and query it in PHP. For example:
$lrs = new TinCan\RemoteLRS(
    'endpoint/public/data/xAPI/', // your LRS endpoint
    '1.0.1',                      // xAPI spec version
    'username',                   // basic auth key (username)
    'key'                         // basic auth secret (password)
);
$actor = new TinCan\Agent(
    [ 'mbox' => 'mailto:dikla@gmail.com' ]
);
$verb = new TinCan\Verb(
    [ 'id' => 'http://adlnet.gov/expapi/verbs/progressed' ]
);
$activity = new TinCan\Activity(
    [ 'id' => 'http://game.t-shirt' ]
);
$statement = new TinCan\Statement(
    [
        'actor' => $actor,
        'verb' => $verb,
        'object' => $activity,
    ]
);
// get all of an actor's activity by their unique id
function getAllActorActivity($actorUri) {
    global $lrs;
    // $actorUri should look like 'mailto:dikla@gmail.com'
    $actor = new TinCan\Agent(
        [ 'mbox' => $actorUri ]
    );
    $answer = $lrs->queryStatements(['agent' => $actor]);
    return $answer;
}
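For example, to call it and page through the results (a hedged sketch: in TinCanPHP, queryStatements() returns an LRSResponse, so check success before reading content; the paging below assumes the library's moreStatements() helper, which follows the result's "more" URL):
// Hedged usage sketch: fetch an actor's statements and follow the "more"
// property to collect the full result set.
$response = getAllActorActivity('mailto:dikla@gmail.com');
if ($response->success) {
    $result = $response->content; // a TinCan\StatementsResult
    $statements = $result->getStatements();
    // While the LRS reports another page, fetch it and append its statements
    while ($result->getMore()) {
        $response = $lrs->moreStatements($result);
        if (!$response->success) {
            break;
        }
        $result = $response->content;
        $statements = array_merge($statements, $result->getStatements());
    }
    foreach ($statements as $statement) {
        echo $statement->getId() . PHP_EOL;
    }
}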
If you are working in JavaScript, you can use the ADL xAPI Wrapper, which simplifies communication with an LRS: https://github.com/adlnet/xAPIWrapper#get-statements
In general you do a GET request on the /statements endpoint. Try just that first and see if you get a JSON response with a "statements" and a "more" property. If that works, you can narrow down the results with filters. See the spec for the details and options: https://github.com/adlnet/xAPI-Spec/blob/master/xAPI.md#stmtapiget
Try this curl command; it should return a statement result, albeit from the ADL LRS. Note that the xAPI spec requires the X-Experience-API-Version header on every request:
curl --user tom:1234 -H "X-Experience-API-Version: 1.0.1" https://lrs.adlnet.gov/xapi/statements
I use GuzzleHttp to send data via _bulk to an Elasticsearch index. It is only a small dataset of 850 records. When I transfer the data record by record, I get an error message for 17 records. That is fine for me, since I can then fix those errors.
But when I use _bulk, I do not get any error message at all. The 17 incorrect records are just ignored and are missing from the index. How can I get an error message here? Are there some kind of options that I can use? Any ideas?
The endpoint is shown in the code. Here are my main code parts:
use GuzzleHttp\Client;

$jsonData = "xxxxx"; // the payload for the request
$elasticUrl = "https://xxxx.xx/xxxxx/_doc/_bulk";

$client = new Client([
    "verify" => false,      // disable SSL certificate verification
    "timeout" => 600,       // maximum timeout for requests, in seconds
    "http_errors" => false  // disable exceptions on HTTP error status codes
]);
$header = ["Content-Type" => "application/json"];

$result = $client->post($elasticUrl, [
    "headers" => $header,
    "body" => $jsonData
]);

if ($result->getStatusCode() != 200) {
    $ret = "Error ".$result->getStatusCode()." with message: ".$result->getReasonPhrase();
}
A bulk request will always succeed with HTTP 200 as long as the request itself is well-formed.
However, the bulk response body indicates whether each individual item succeeded. If you see "errors": true in the response, then you know some of the items could not be indexed, and looking into the items array you'll find the error for the corresponding items.
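For illustration, a bulk response with one failed item looks roughly like this (the index name, ids, and error values here are made up):
{
  "took": 30,
  "errors": true,
  "items": [
    { "create": { "_index": "myindex", "_id": "1", "status": 201 } },
    { "create": { "_index": "myindex", "_id": "2", "status": 400,
        "error": { "type": "mapper_parsing_exception", "reason": "failed to parse field [price]" } } }
  ]
}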
As @Val pointed out, $result->getBody() gives the needed information:
$body = (string) $result->getBody();
$bodyArray = json_decode($body, true);

if ($bodyArray["errors"]) {
    $retArray = [];
    // each entry in "items" mirrors one action of the bulk request, in order
    foreach ($bodyArray["items"] as $key => $item) {
        if (isset($item["create"]["error"])) {
            $retArray[] = $item["create"]["error"]["reason"].": ".json_encode($data[$key]);
        }
    }
    $ret = implode(", ", $retArray);
}
As a side note: in $data I keep the data as a PHP array before sending it to Elasticsearch.
Is it possible to get data from new Google Analytics (GA4) accounts through the v4 API? It always returns the following error message:
{
  "error": {
    "code": 403,
    "message": "User does not have sufficient permissions for this profile.",
    "errors": [
      {
        "message": "User does not have sufficient permissions for this profile.",
        "domain": "global",
        "reason": "forbidden"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
I can do it perfectly with UA accounts.
Is there any API (web server request - OAuth) specific to this new account type?
Here is the code used (PHP):
require_once __DIR__ . '/vendor/autoload.php';
session_start();

$client = new Google_Client();
$client->setAuthConfig(__DIR__ . '/FILE.json');
$client->addScope(Google_Service_Analytics::ANALYTICS_READONLY);
$client->setAccessToken($_SESSION['access_token']);

$analytics = new Google_Service_AnalyticsReporting($client);
$response = getReport($analytics);
printResults($response);

function getReport($analytics) {
    $dateRange = new Google_Service_AnalyticsReporting_DateRange();
    $dateRange->setStartDate("7daysAgo");
    $dateRange->setEndDate("today");

    $sessions = new Google_Service_AnalyticsReporting_Metric();
    $sessions->setExpression("name");
    $sessions->setAlias("sessions");

    $request = new Google_Service_AnalyticsReporting_ReportRequest();
    $request->setViewId('307566943');
    $request->setDateRanges($dateRange);
    $request->setMetrics(array($sessions));

    $body = new Google_Service_AnalyticsReporting_GetReportsRequest();
    $body->setReportRequests(array($request));
    return $analytics->reports->batchGet($body);
}
User does not have sufficient permissions for this profile
This means that the user you have authenticated your application with does not have permission to access the Google Analytics view you are trying to extract data from.
The issue can also be caused by trying to use the Google Analytics Reporting API with a Google Analytics GA4 account. GA4 property IDs are not the same as UA view IDs, so the system gets confused and assumes you simply don't have access.
The solution is to authenticate the app with a user that has access to that view, or to grant the user access, and to check that you are using the correct API for the type of Google Analytics account you are trying to access.
UA vs GA4
Also remember that to extract data from a GA4 account you need to use the Google Analytics Data API. If you have been extracting data from UA accounts, you have been using the Google Analytics Reporting API. These are two completely different APIs with different methods.
Google Analytics Data API quick start:
require 'vendor/autoload.php';

use Google\Analytics\Data\V1beta\BetaAnalyticsDataClient;
use Google\Analytics\Data\V1beta\DateRange;
use Google\Analytics\Data\V1beta\Dimension;
use Google\Analytics\Data\V1beta\Metric;

/**
 * TODO(developer): Replace this variable with your Google Analytics 4
 * property ID before running the sample.
 */
$property_id = 'YOUR-GA4-PROPERTY-ID';

// Using a default constructor instructs the client to use the credentials
// specified in the GOOGLE_APPLICATION_CREDENTIALS environment variable.
$client = new BetaAnalyticsDataClient();

// Make an API call.
$response = $client->runReport([
    'property' => 'properties/' . $property_id,
    'dateRanges' => [
        new DateRange([
            'start_date' => '2020-03-31',
            'end_date' => 'today',
        ]),
    ],
    'dimensions' => [
        new Dimension([
            'name' => 'city',
        ]),
    ],
    'metrics' => [
        new Metric([
            'name' => 'activeUsers',
        ]),
    ],
]);

// Print the results of the API call.
print 'Report result: ' . PHP_EOL;
foreach ($response->getRows() as $row) {
    print $row->getDimensionValues()[0]->getValue()
        . ' ' . $row->getMetricValues()[0]->getValue() . PHP_EOL;
}
I am trying to invoke a Lambda function from AppSync, passing a search query. The Lambda calls Elasticsearch, which returns the result set.
I am able to map the result set to different fields in the GraphQL schema:
#set($result = {
    "statusCode": "${context.result.statusCode}",
    "headers": "${context.result.headers}",
    "isBase64Encoded": "${context.result.isBase64Encoded}",
    "body": "${context.result.body}"
})
$util.toJson($result)
In body I get the search result set, which I then need to parse and map to the schema.
I am unable to extract the response via ${context.result.body.hits.hits} to iterate through the _source entries and build the search result set.
Any suggestions and guidance would be very helpful.
AppSync has built-in support for Amazon Elasticsearch resolvers. You can find some more information about that here!
However, if you wish to keep your current Lambda resolver, you could try the following mapping template:
## Declare an empty array
#set( $result = [] )

## Loop through the results
#foreach($entry in $context.result.hits.hits)
    ## Add each item to the result array
    $util.qr($result.add(
        {
            'id' : $entry.get("_source")['id'],
            'title' : $entry.get("_source")['fields']['title'],
            'plot' : $entry.get("_source")['fields']['plot'],
            'year' : $entry.get("_source")['fields']['year'],
            'url' : $entry.get("_source")['fields']['image_url']
        }))
#end

## Return the result as JSON
$util.toJson($result)
The issue was resolved by converting the context result's body as below. Once this was done, I was able to iterate over the result set:
[
#set($result = $context.result)
## parse the body string back to JSON
#set($result.resultSet = $util.parseJson($context.result.body))
#foreach($entry in $result.resultSet.hits.hits)
    ## $velocityCount starts at 1 and increments with each #foreach iteration
    #if( $velocityCount > 1 ) , #end
    $util.toJson(
        {
            'id' : $entry.get("_source")['id'],
            'title' : $entry.get("_source")['fields']['title'],
            'plot' : $entry.get("_source")['fields']['plot'],
            'year' : $entry.get("_source")['fields']['year'],
            'url' : $entry.get("_source")['fields']['image_url']
        }
    )
#end
]
I am creating a query application that lets users filter several tables down and then download the results as CSV files. Doing this with small result sets has proven very easy, but several of the result datasets will be 300k+ rows of data.
Timeout errors were being thrown, so I need a new approach. The application is written in Laravel.
I was able to run a raw query and create a CSV with 380k rows of data, but the --secure-file-priv option forced me to put the file in a specific place. I need the download file to be accessible to the user who filtered the data down.
My current three approaches are as follows:
Approach #1:
// $performance = DB::select("SELECT * from performance_datas INTO OUTFILE '/var/lib/mysql-files/performance_data2.csv' FIELDS ENCLOSED BY '\"' TERMINATED BY ';' ESCAPED BY '\"' LINES TERMINATED BY '\r\n' ;");
This raw query created the intended file, but I don't know how to make it accessible for the user to download.
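One hedged option, assuming the PHP process can actually read the directory that --secure-file-priv points at (it often cannot), is to hand that file back with Laravel's download response. The controller action and download filename here are made up for illustration:
// Hypothetical controller action: serve the file MySQL wrote via INTO OUTFILE.
// Assumes the export has finished and the web user has read access to the path.
public function downloadPerformanceCsv()
{
    $path = '/var/lib/mysql-files/performance_data2.csv';

    return response()->download($path, 'performance_data.csv', [
        'Content-Type' => 'text/csv',
    ]);
}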
Approach #2:
use Symfony\Component\HttpFoundation\StreamedResponse;

$headers = array(
    'Content-Type' => 'text/csv',
    'Cache-Control' => 'must-revalidate, post-check=0, pre-check=0',
    'Content-Disposition' => 'attachment; filename=performances.csv',
    'Expires' => '0',
    'Pragma' => 'public',
);

$response = new StreamedResponse(function() {
    // Open the output stream (php://output, so rows go to the response body)
    $handle = fopen('php://output', 'w');

    // Add CSV headers
    fputcsv($handle, [
        "id", "ref", "DataSet", "PubID", "TrialID", "TrtID", "SubjectID", "Site_Sample", "Day_Sample",
        "Time_Sample", "VarName", "VarValue", "VarUnits", "N", "SEM", "SED", "VarType"
    ]);

    // Note: all() loads the entire table into memory before chunking the collection
    PerformanceData::all()
        ->chunk(1500, function($datas) use ($handle) {
            foreach ($datas as $data) {
                // Add a new row with data
                fputcsv($handle, [
                    // put data in file
                ]);
            }
        });

    // Close the output stream
    fclose($handle);
}, 200, $headers);
This approach timed out. I used Eloquent's ::all() here, and this would be the largest dataset for this particular table.
Approach #3 was just a set of variations on approach #2, with the same timeout results.
I need the user to be able to create a CSV and download it once it's ready, or to create and download it directly.
Open to any suggestions!
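One direction to try, sketched under assumptions (the model and column names come from the question, everything else is illustrative): stream to php://output and chunk at the query level instead of calling ::all(), so the full result set never sits in memory at once. Very long exports may still need set_time_limit(0) or a queued job.
use Symfony\Component\HttpFoundation\StreamedResponse;

$headers = [
    'Content-Type' => 'text/csv',
    'Content-Disposition' => 'attachment; filename=performances.csv',
];

return new StreamedResponse(function () {
    $handle = fopen('php://output', 'w');

    fputcsv($handle, ["id", "ref", "DataSet", "PubID", "TrialID"]);

    // Query-builder chunk() fetches 1500 rows per database round trip,
    // unlike Collection::chunk() after all(), which loads every row first.
    PerformanceData::query()->chunk(1500, function ($rows) use ($handle) {
        foreach ($rows as $row) {
            fputcsv($handle, [$row->id, $row->ref /* ... remaining columns */]);
        }
    });

    fclose($handle);
}, 200, $headers);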
I want to send a NEST delete request to Elasticsearch without specifying the object, which I don't have. I've seen solutions like:
var response = elasticClient.DeleteByQuery<MyClass>(q => q
    .Match(m => m.OnField(f => f.Guid).Equals(someObject.Guid))
);
From: DeleteByQuery using NEST and ElasticSearch
As I'm just reading plain text from a queue, I don't have access to the MyClass object to use with the delete request. Basically I just want to delete all documents in an index (whose name I know) where a field matches, for example orgId = 1234. Something like:
var response = client.DeleteByQuery<string>(q => q
    .Index(indexName)
    .AllTypes()
    .Routing(route)
    .Query(rq => rq
        .Term("orgId", "1234"))
);
I see that the NEST IElasticClient interface does have a DeleteByQuery method that doesn't require the mapping object, but I'm just not sure how to implement it.
You can just specify object as the document type T for DeleteByQuery<T>; just be sure to explicitly provide the index name and type name to target in this case. T is used only to provide strongly typed access within the body of the request. For example,
var client = new ElasticClient();

var deleteByQueryResponse = client.DeleteByQuery<object>(d => d
    .Index("index-name")
    .Type("type-name")
    .Query(q => q
        .Term("orgId", "1234")
    )
);
This will generate the following query:
POST http://localhost:9200/index-name/type-name/_delete_by_query
{
  "query": {
    "term": {
      "orgId": {
        "value": "1234"
      }
    }
  }
}
Replace _delete_by_query with _search in the URI first, to ensure you're targeting the expected documents :)
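For example, the verification request against the same index and query would be:
POST http://localhost:9200/index-name/type-name/_search
{
  "query": {
    "term": {
      "orgId": {
        "value": "1234"
      }
    }
  }
}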