Not able to generate extracted_queries.json using persistgraphql - graphql

I am trying to generate the extracted_queries.json file using the persistgraphql CLI utility, but when I run the following command
persistgraphql src/app/ --add_typename
it results in this error:
Unable to process input path src/app/. Error message:
Syntax Error GraphQL request (1:1) Unexpected <EOF>
1:
^
How can I fix this?

After looking for answers for a while, I was able to make it work. In my case, our queries and mutations are written in .js files, so the command below worked for me:
persistgraphql src/app --js --extension=js --add_typename
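As a quick sanity check of the result, here is a minimal Python sketch, assuming persistgraphql writes extracted_queries.json as a JSON object mapping each extracted query string to a numeric ID:
import json

# Load the query map written by persistgraphql and print each ID together
# with the first line of its query text.
with open("extracted_queries.json") as f:
    query_map = json.load(f)

for query_text, query_id in query_map.items():
    print(query_id, query_text.splitlines()[0])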

Related

Can't use put() to add data to HBase with happybase

My Python version is 3.7. After running pip3 install happybase, I started the Thrift server with hbase thrift start and wrote a brief .py file as follows:
import happybase
connection = happybase.Connection('master')
table = connection.table('jmlr')  # 'jmlr' is a table in hbase
for i in table.scan():
    print(i)
table.put('001', {'title':'dasds'})  # error here
connection.close()
When it got to table.put(), it reported this error:
thriftpy2.transport.base.TTransportException: TTransportException(type=4, message='TSocket read 0 bytes')
At the same time, the Thrift server reported an error:
ERROR [thrift-worker-1] thrift.TBoundedThreadPoolServer: Error occurred during processing of message. java.lang.IllegalArgumentException: Invalid famAndQf provided.
But when I ran the Python file again just now, Thrift gave me a different error:
thrift.TBoundedThreadPoolServer: Thrift error occurred during processing of message.
org.apache.thrift.protocol.TProtocolException: Bad version in readMessageBegin
I have tried adding parameters like protocol='compact' and transport='framed', but that didn't work; even table.scan() failed.
Everything is fine in the hbase shell, so I can't figure out what went wrong, and I'm about to collapse.
I ran into the same issue and found this solution. You need to include a column qualifier, even an empty one (the ':' symbol is the delimiter between column family and column qualifier), in the column key you pass to put():
table.put('001', {'title:': 'dasds'})
Also, you got a different error message on the second run of the script because the Thrift server had already failed.
I hope this helps.
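For reference, a minimal working version of the script from the question might look like the sketch below, keeping the 'master' host and 'jmlr' table from the question and assuming 'title' is an existing column family:
import happybase

connection = happybase.Connection('master')
table = connection.table('jmlr')

# Column keys must be in 'family:qualifier' form; the qualifier may be
# empty, but the colon separating family and qualifier is required.
table.put('001', {'title:': 'dasds'})

for key, data in table.scan():
    print(key, data)

connection.close()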

Error: command "bash" failed with no error message?

I am using Terraform on my Mac, and terraform apply results in the error below:
Error: command "bash" failed with no error message
on ssm.tf line 7, in data "external" "ssm-dynamic-general":
7: data "external" "ssm-dynamic-general" {
However, there is nothing wrong with the ssm.tf file; the same configuration runs perfectly fine on my other system.
Can someone please let me know what I am missing here?
You might have done what I accidentally did: not follow the external program protocol:
https://www.terraform.io/docs/providers/external/data_source.html#external-program-protocol
In my particular case, I failed to send the errors coming from my program to standard error; instead, they were going to standard output.
That's why Terraform wasn't able to report on those errors.
So if you send any and all errors from your program to standard error (e.g. with >&2 in bash), you should be able to see them when you run terraform plan.
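For example, a minimal external program that follows the protocol, sketched here in Python rather than bash (the script name, the parameter_name input and the value output key are placeholders, not from the original configuration), could look like this:
#!/usr/bin/env python3
# ssm_lookup.py - example program for Terraform's external data source.
# Protocol: read a JSON object from stdin, write a JSON object of string
# values to stdout, and send all diagnostics to stderr, exiting non-zero
# on failure.
import json
import sys

def main():
    try:
        query = json.load(sys.stdin)              # values from the data source's `query` argument
        name = query.get("parameter_name", "")
        if not name:
            raise ValueError("parameter_name is required")
        # ... look up the real value here ...
        result = {"value": "example-value-for-" + name}
        json.dump(result, sys.stdout)             # result must be a flat JSON object of strings
    except Exception as exc:
        print(f"error: {exc}", file=sys.stderr)   # errors go to stderr so Terraform can surface them
        sys.exit(1)

if __name__ == "__main__":
    main()

The data source would then reference it with something like program = ["python3", "ssm_lookup.py"], and Terraform surfaces whatever the script prints to stderr when it fails.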

Errors When Running Titanium-Connection-Test

Video documentation on the Appcelerator website explains that if I run into errors during the titanium-connection-test, I should ask for support.
Currently I am getting two errors:
Testing logging in against api.appcelerator.net using cURL
Failed: Unexpected end of input
Testing logging in against dashboard.appcelerator.com using cURL
Failed: Unexpected end of input
All other tests passed green. Any idea how I can fix these issues?
Video Source: http://videos.appcelerator.com/video/XmWEDgSP/Installing-Appcelerator-Studio/bSvBEOmi/Get-Started

PIG Cassandra ERROR 2118 Could not get input splits

I started off trying to do a simple Pig + Cassandra integration with this tutorial from DataStax: http://docs.datastax.com/en/datastax_enterprise/4.5/datastax_enterprise/ana/anaPigExRel.html
but when I try to store the result into CQL, I get this error:
Message: org.apache.pig.backend.executionengine.ExecException: ERROR
2118: Could not get input splits
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:279)
Any ideas what's happening? I read some answers here referring to changing my PIG_PARTITIONER to Murmur3Partitioner,
which I already did, and it still happens. Is it a configuration issue?
export PIG_PARTITIONER=org.apache.cassandra.dht.Murmur3Partitioner
I found out that after setting:
export PIG_PARTITIONER=org.apache.cassandra.dht.Murmur3Partitioner
I need to run source ~/.bashrc and start pig from that same console.
I still get another error, but I think this case is solved.

Oracle Exadata Error: CellCli not Found Error

I am trying to run cellcli on one of my Exadata cell servers.
When I log in to the server, I can see all the files I expect
(like all_group, all_nodelist_group, cell_group, all_ib_group, etc.).
When I issue the command to start cellcli, it gives me a 'command not found' error:
# cellcli
-bash: cellcli: command not found
# which cellcli
which: no cellcli in (/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)
Any idea what the location of the cellcli executable is on Exadata?
Do I need to export any other path to get this command?
cellcli is in /opt/oracle/cell/cellsrv/bin. It should be put on the PATH by /etc/profile.d/cell_env.sh.
(from Marc Fielding)
