Registering Spark API Token - laravel

I have created a Spark API token and am trying to register it using this command:
spark register token-value
but I am getting an error:
Cannot open source file register.ada
Does anyone have an idea what might be causing this error on Ubuntu?

You need to buy Spark to get the API token. Once you have it, replace token-value with your API key.

This solved it for me: the fix required removing AdaCore SPARK, whose own spark executable was being picked up instead of Laravel Spark's installer: https://laracasts.com/discuss/channels/spark/cant-find-registerada-when-tried-to-register-spark-api-token
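The root cause is a PATH collision: AdaCore's toolchain also installs an executable named spark, and whichever directory comes first on PATH wins. A minimal, self-contained Python sketch of that resolution behaviour (the two directories and their stand-in scripts are hypothetical, created just for the demo):

```python
import os
import shutil
import tempfile

# Create two stand-in 'spark' executables, one playing AdaCore's tool
# and one playing Laravel Spark's installer.
ada_dir = tempfile.mkdtemp()
laravel_dir = tempfile.mkdtemp()
for directory, label in ((ada_dir, "adacore"), (laravel_dir, "laravel")):
    script = os.path.join(directory, "spark")
    with open(script, "w") as handle:
        handle.write("#!/bin/sh\necho %s\n" % label)
    os.chmod(script, 0o755)

# AdaCore's directory first on the search path: its 'spark' shadows
# the Laravel one, which is what produced the register.ada error.
search = os.pathsep.join([ada_dir, laravel_dir])
print(shutil.which("spark", path=search))  # path inside ada_dir

# With AdaCore's directory removed (or ordered later), the right
# executable is found.
search = os.pathsep.join([laravel_dir, ada_dir])
print(shutil.which("spark", path=search))  # path inside laravel_dir
```

Running `which -a spark` in the shell shows the same ordering and is a quick way to confirm which binary is shadowing which.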

Related

Access to Nifi flow using Nipy Api and LDAP

I'm trying to get access to a NiFi flow through nipyapi and LDAP.
I have NiFi and NiFi Registry up and running, with a login/password ('login'/'password'):
import nipyapi
nipyapi.config.nifi_config.host = 'https://nifiexample.com/nifi'
nipyapi.config.registry_config.host = 'https://nifiexample.com/nifi-registry'
print(nipyapi.canvas.get_root_pg_id())
I read the docs and found this method:
nipyapi.security.set_service_ssl_context(service='nifi', ca_file=None, client_cert_file=None, client_key_file=None, client_key_password=None)
but since I'm not a developer, I don't understand how to use it properly.
Can someone please tell me what other configs/properties I should add to run this simple script?
I would recommend using the Secured Connection Demo from the docs. The Python code goes through this process step-by-step.
Understanding how NiFi uses TLS and performs authentication and authorization will also help these steps make sense.
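As a rough sketch of what the demo walks through (the CA bundle path is a placeholder, and this assumes the endpoints accept LDAP username/password login), the SSL context and login calls could be combined like this:

```python
import nipyapi

# Hosts from the question.
nipyapi.config.nifi_config.host = 'https://nifiexample.com/nifi'
nipyapi.config.registry_config.host = 'https://nifiexample.com/nifi-registry'

# Trust the server's TLS certificate; the CA file path is an assumption,
# point it at the bundle that signed your NiFi certificate.
nipyapi.security.set_service_ssl_context(
    service='nifi',
    ca_file='/path/to/ca.pem',
)

# Authenticate with the LDAP credentials from the question.
nipyapi.security.service_login(
    service='nifi', username='login', password='password'
)

print(nipyapi.canvas.get_root_pg_id())
</imports></imports>
```

This is a sketch against a live, secured NiFi instance, not something runnable standalone; the Secured Connection Demo linked above shows the full, tested sequence.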

Unable to Create Common Data Service DB in Default Environment Power Apps

I am unable to create a new Common Data Service Database in my Power Apps default environment. Please see the error text below.
It looks like you don't have permission to use the Common Data Service
in this environment. Switch to a different environment, or create your
own.
As I understand it, I should be able to create one after the Microsoft Business Application October 2018 update, as described in the article at the following link.
https://community.dynamics.com/365/b/dynamicscitizendeveloper/archive/2018/10/17/demystifying-dynamics-365-and-powerapps-environments-part-1
Also, when I try to create a Common Data Service app in my default environment, I encounter the following error.
The data did not load correctly. Please try again.
The environment 'Default-57e1485d-1197-4afd-b792-5c423ab508d9' is not
linked to a new CDS 2.0 instance. The operation 'ListInstanceMetadata'
is forbidden for unlinked environments
Moreover, I am unable to see the default environment on https://admin.powerapps.com/environments; I can only see the Sandbox environment there.
Any ideas what I am missing here?
Thank you.
Someone else faced a similar issue, and I read in one of the threads that clearing the browser cache and trying again, or switching to a different browser, resolved it. Could you try these first-level steps and check whether you still see these errors?
Ref: https://powerusers.microsoft.com/t5/Common-Data-Service-for-Apps/Default-Environment-Error-on-CDS/m-p/233582#M1281
Also, for your permission error ref: https://powerusers.microsoft.com/t5/Common-Data-Service-for-Apps/Common-Data-Service-Business-Flows/td-p/142053
I have not validated these findings, but as these answers are from the Microsoft and PowerApps teams, I hope they help!

Support for Flink ACL in YARN

In a secured Hadoop cluster, I am trying to access the Flink AM page and logs from YARN and am seeing the following error:
User %remoteUser are not authorized to view application %appID
It seems the cause is a lack of support for YARN application ACLs on the Flink side.
How the code works
The message comes from the hadoop/yarn/server AppBlock class, which uses the ApplicationACLsManager class. This class performs the check against app info that was set in RMAppManager:
this.applicationACLsManager.addApplication(applicationId,
    submissionContext.getAMContainerSpec().getApplicationACLs());
AMContainerSpec is a ContainerLaunchContext, which has a protobuf implementation and is submitted from the framework side.
In Flink, this object is created in the AbstractYarnClusterDescriptor class, which (like the other classes in Flink) never calls setApplicationACLs.
Question
Is there a way to work around this, or is the right solution to contribute support for it to Flink? What is the state of this feature on the Flink side?
This sounds like a limitation in Flink which we should fix. Please open a JIRA issue. The community would be very happy if you could help implement it.

Spark WebUI Application application_xyz not found

When I try to open the history of any Spark job, I am facing this issue: "Application_id: Application application_xyz not found".
NOTE:
I previously traced this to one of the Spark history folders being full, which caused this error, but now I don't remember how I fixed it.
Any Help is much appreciated.
To be able to access the Spark UI after an application has finished, you need a separate history server.
Please start the server:
$SPARK_HOME/sbin/start-history-server.sh
and follow the configuration notes.
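For reference, a minimal spark-defaults.conf sketch (the HDFS log directory below is an assumption; applications and the history server must point at the same location). The cleaner settings are relevant to the "history folder was full" situation from the question, since they make Spark delete old event logs automatically:

```properties
spark.eventLog.enabled             true
spark.eventLog.dir                 hdfs:///spark-logs
spark.history.fs.logDirectory      hdfs:///spark-logs
# Periodically delete old event logs so the history folder cannot fill up
spark.history.fs.cleaner.enabled   true
spark.history.fs.cleaner.maxAge    7d
spark.history.fs.cleaner.interval  1d
```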

elasticsearch-hadoop 1.3 M3 proxy support not working

I am a beginner with Elasticsearch and Hadoop. I am having a problem moving data from HDFS into an Elasticsearch server using es.net.proxy.http.host with credentials. The server is secured with credentials using an nginx proxy configuration. But when I try to move the data using a Pig script, it throws a NullPointerException.
My pig script is
REGISTER elasticsearch-hadoop-1.3.0.M3/dist/elasticsearch-hadoop-1.3.0.M3.jar
A = load 'date' using PigStorage() as (date:datetime);
store A into 'doc/id' using org.elasticsearch.hadoop.pig.EsStorage('es.net.proxy.http.host=ipaddress','es.net.proxy.http.port=portnumber','es.net.proxy.http.user=username','es.net.proxy.http.pass=password');
I don't understand where the problem is in my script. Can anyone please help me?
Thanks in Advance.
I faced this type of problem. elasticsearch-hadoop-1.3.0.M3.jar does not seem to support the proxy and authentication settings. Try the elasticsearch-hadoop-1.3.0.BUILD-SNAPSHOT.jar file instead. However, I still couldn't move object data such as Tuple to the production server with authentication.
Thank you
