How to import data in .cypherl format using mgconsole to a Memgraph that runs in WSL? - memgraphdb

I have a Memgraph instance running inside WSL (Ubuntu distribution). I have the data in .cypherl format. I don't have the possibility to connect Memgraph Lab to this instance.
How can I import my data?

Data can be imported from a .cypherl file using mgconsole. Open a terminal in your Ubuntu distribution, navigate to the folder where your .cypherl file is located, and run the following command:
mgconsole < queries.cypherl

Related

How to import data in .cypherl format to a Memgraph that runs in Docker?

I have Memgraph installed using Docker. I didn't use the Memgraph Platform Docker image, so I don't have Memgraph Lab.
I have data in .cypherl format. How can I import it into Memgraph?
The exact procedure/command depends on your operating system. If you are using Linux or macOS, the command is:
docker run -i --entrypoint=mgconsole memgraph/memgraph-platform --host HOST < queries.cypherl
If you are running Docker under Windows then the command is:
cmd.exe /c "docker run -i --entrypoint=mgconsole memgraph/memgraph-platform --host HOST < queries.cypherl"
Be sure to replace HOST with a valid IP address of the container, specify the correct Memgraph Docker image you are using, and provide the correct path to the file.

Export a Google Compute Engine (GCE) image to use it locally with VirtualBox

How can I export a GCE image to use it in a local VirtualBox?
I get the error:
error: No such device
for the files gce-image-export.vmdk, gce-image-export.qcow2, and gce-image-export.vdi. I use the command:
qemu-img convert -O vdi gce-image-export.qcow2 gce-image-export.vdi
I get the same error for all of *.vmdk, *.qcow2, and *.vdi.
Do you have any input for me?
Thanks
kivitendo
You can export the image using the gcloud command. The following documentation describes the full usage of the command and its flags.
gcloud compute images export \
--destination-uri <destination-uri> \
--image <image-name> \
--export-format <format>
The --export-format flag exports the image to a format supported by QEMU using qemu-img. Valid formats include 'vmdk', 'vhdx', 'vpc', 'vdi', and 'qcow2'.
The image is exported to a Cloud Storage bucket, from which you can later download it.
Thanks. I can only export from Google Cloud Platform (GCE) to the
*.VMDK
*.VHDX
*.VPC
*.qcow2
formats.
In VirtualBox 6.1 I had to enable EFI support for the virtual machine. If I then use rEFInd 0.12 as a boot helper, I can start my GCE *.vmdk machine.
However, I get many error messages, and I cannot log in to the machine to repair them and install grub-efi.
My installed NextCloud server does start.
How can I log in to my machine? root doesn't work.
I can't find any tutorial.
kivitendo

AWS Boto3: Code works in the IPython console but not in a Jupyter Notebook

I am trying to take notes while studying boto3, and I want to use Jupyter. The code below works in the interactive console, but when I try it in Jupyter it fails with:
EndpointConnectionError: Could not connect to the endpoint URL:
"https://ec2.Central.amazonaws.com/"
I suspect this is because Jupyter is not able to find the config and credentials files, but I am not sure; the message does not say exactly that.
import boto3
ec2 = boto3.resource('ec2')
response = ec2.create_vpc(
    CidrBlock='10.0.0.0/16',
)
print(response)
You could always provide your credentials to resource explicitly:
ec2 = boto3.resource(
    'ec2',
    region_name='REGION_NAME',
    aws_access_key_id='AWS_ACCESS_KEY_ID',
    aws_secret_access_key='AWS_SECRET_ACCESS_KEY'
)
To get this working, I had to create a system environment variable that holds the path to the config file. The solution suggested by @scangetti is not secure, since it hard-codes the credentials in the source.
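That environment-variable approach can be sketched as follows. The region value and file paths are assumptions to adjust for your setup, while AWS_CONFIG_FILE, AWS_SHARED_CREDENTIALS_FILE, and AWS_DEFAULT_REGION are the variable names botocore actually reads:

```python
import os

# Point botocore at the config and credentials files explicitly, so the
# notebook kernel does not depend on the HOME it happened to inherit.
os.environ["AWS_CONFIG_FILE"] = os.path.expanduser("~/.aws/config")
os.environ["AWS_SHARED_CREDENTIALS_FILE"] = os.path.expanduser("~/.aws/credentials")
os.environ["AWS_DEFAULT_REGION"] = "eu-central-1"  # assumed region; adjust to yours

# With the variables set before boto3 is used in the notebook, the
# original code works unchanged:
# import boto3
# ec2 = boto3.resource('ec2')
```

Note that the malformed endpoint in the error ("ec2.Central.amazonaws.com") suggests the literal string "Central" was picked up as the region; setting a valid region name, by variable or in the config file, avoids that.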

Import Modules in Nifi ExecuteScript

I am new to NiFi and Python, and I want to execute my Python script, so I used ExecuteScript and tried to import certain modules, like this:
import json, sftp, paramiko
Though I have sftp installed, when I import it in ExecuteScript it says "Failed to process session. No module named sftp at line number 1", even though:
which -a sftp
/usr/bin/sftp
When importing paramiko, I got the same error.
The "python" engine used by ExecuteScript and InvokeScriptedProcessor is actually Jython, not pure Python. This means it cannot load native modules (.so files, compiled C files, etc.). According to this SO post, paramiko uses Crypto, which has native libraries, so it cannot be used in Jython (see the bottom of this post for my comment on that). My guess is that the sftp library does the same.
Jython can make use of pure Python modules, there is a discussion on the NiFi mailing list about how to point at (and include) those kinds of modules.
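As a sketch, a script run by ExecuteScript can extend the Jython path itself to pick up such modules. The directory and module names below are hypothetical, and the processor's Module Directory property is the more idiomatic way to achieve the same effect:

```python
import sys

# Hypothetical directory holding pure-Python (no C extension) modules.
module_dir = "/opt/nifi/pylib"
if module_dir not in sys.path:
    sys.path.append(module_dir)

# A pure-Python module placed in that directory could now be imported:
# import mypuremodule  # hypothetical module name
```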
The ExecuteScript processor uses its own Jython engine to execute your Python scripts. Since the libraries you are importing are not available in NiFi's built-in Jython engine, it throws an error.
SOLUTION:
If Python is already installed with all those libraries on your machine (the same machine where NiFi is installed), you can use that Python interpreter to execute your script via the ExecuteProcess processor; see the configuration of ExecuteProcess.
If it's really important that you use Python, you can also use ExecuteStreamCommand. It will run the code using the Python interpreter installed on your machine.
The disadvantage is that you cannot access the FlowFile's attributes inside your Python code, only its content.
To access the content,
import sys
data = sys.stdin.readlines()
and to pass the content to the next processor, just print your output.
print("THIS IS MY OUTPUT, IT WILL BE PASSED AS CONTENT TO THE NEXT PROCESSOR")
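Put together, a stdin-to-stdout script for ExecuteStreamCommand might look like the sketch below; the JSON field it adds is a made-up example of a transformation:

```python
import json
import sys

def transform(content):
    """Hypothetical transformation: parse the FlowFile's JSON content and
    add a marker field."""
    record = json.loads(content)
    record["processed"] = True
    return json.dumps(record)

if __name__ == "__main__" and not sys.stdin.isatty():
    # ExecuteStreamCommand pipes the FlowFile content to stdin; whatever is
    # printed to stdout becomes the content handed to the next processor.
    raw = sys.stdin.read()
    if raw.strip():
        print(transform(raw))
```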
Otherwise, if you need to stick with ExecuteScript, use Groovy; it will save you a lot of headaches.

Remotely running the "show()" command on matplotlib.pyplot

I'm currently trying to run a Python script whose end goal is to show a figure. In skeleton form it looks like:
import matplotlib.pyplot as p
p.figure()
[build figure, create plots]
p.show()
I'm trying to run this script remotely. It's located on another machine, and I'm trying to run it while ssh'd from my laptop, using:
$ ssh -X myusername@myhostname
However, whenever I execute my script, I get the following error, raised by p.show():
This program needs access to the screen.
Please run with 'pythonw', not 'python', and only when you are logged
in on the main display of your Mac.
When I run with pythonw instead of python, I get the same error. Is there any way to configure matplotlib and ssh to be able to show or save plot files on remote machines?
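One common workaround, assuming it is acceptable to save the figure to a file and copy it back to the laptop, is to select the non-interactive Agg backend before pyplot is imported; the plotted data below is a placeholder for the real figure-building code:

```python
import matplotlib
matplotlib.use("Agg")  # file-rendering backend: needs no display or window system
import matplotlib.pyplot as plt

plt.figure()
plt.plot([0, 1, 2, 3], [0, 1, 4, 9])  # placeholder for the real plots
plt.savefig("figure.png")  # copy the file back (e.g. with scp) to view it locally
```

With Agg selected there is nothing for show() to display, so the savefig() call replaces it. If an on-screen window is really needed, the alternative is to fix the X forwarding instead: a running X server on the laptop plus ssh -X (or ssh -Y for trusted forwarding) lets an interactive backend open its window locally.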
