I'm trying to install my web application on CentOS 7.
I've tested it on Windows and everything worked fine.
I followed Oracle's instructions, but it's impossible to connect to the DB.
There's already a program running on this machine, so I don't believe Oracle is missing as the error claims. Please check below and help.
Added to .bashrc:
export LD_LIBRARY_PATH=/home/bcweb/oracleclient/instantclient_19_3
Created an oracleclient folder and added all the required files:
-r--r--r-- 1 bcweb oinstall 5780 Apr 17 2019 BASIC_LICENSE
-rw-r--r-- 1 bcweb oinstall 1632 Apr 17 2019 BASIC_README
-rw-r--r-- 1 bcweb oinstall 41840 Apr 17 2019 adrci
-rw-r--r-- 1 bcweb oinstall 59296 Apr 17 2019 genezi
lrwxrwxrwx 1 bcweb oinstall 17 Mar 29 08:18 libcintsh.so -> libclntsh.so.19.3
-rw-r--r-- 1 bcweb oinstall 17 Apr 17 2019 libclntsh.so
-rw-r--r-- 1 bcweb oinstall 17 Apr 17 2019 libclntsh.so.10.1
-rw-r--r-- 1 bcweb oinstall 17 Apr 17 2019 libclntsh.so.11.1
-rw-r--r-- 1 bcweb oinstall 17 Apr 17 2019 libclntsh.so.12.1
-rw-r--r-- 1 bcweb oinstall 17 Apr 17 2019 libclntsh.so.18.1
-rw-r--r-- 1 bcweb oinstall 79961792 Apr 17 2019 libclntsh.so.19.1
-rw-r--r-- 1 bcweb oinstall 8041608 Apr 17 2019 libclntshcore.so.19.1
-r--r--r-- 1 bcweb oinstall 3609536 Apr 17 2019 libipc1.so
-r--r--r-- 1 bcweb oinstall 478432 Apr 17 2019 libmql1.so
-rw-r--r-- 1 bcweb oinstall 6587832 Apr 17 2019 libnnz19.so
-rw-r--r-- 1 bcweb oinstall 15 Apr 17 2019 libocci.so
-rw-r--r-- 1 bcweb oinstall 15 Apr 17 2019 libocci.so.10.1
-rw-r--r-- 1 bcweb oinstall 15 Apr 17 2019 libocci.so.11.1
-rw-r--r-- 1 bcweb oinstall 15 Apr 17 2019 libocci.so.12.1
-rw-r--r-- 1 bcweb oinstall 15 Apr 17 2019 libocci.so.18.1
-rw-r--r-- 1 bcweb oinstall 2339896 Apr 17 2019 libocci.so.19.1
-rw-r--r-- 1 bcweb oinstall 130515320 Apr 17 2019 libociei.so
-r--r--r-- 1 bcweb oinstall 153624 Apr 17 2019 libocijdbc19.so
-rw-r--r-- 1 bcweb oinstall 115976 Apr 17 2019 liboramysql19.so
drwxr-xr-x 3 bcweb oinstall 19 Aug 27 2020 network
-rw-r--r-- 1 bcweb oinstall 4210517 Apr 17 2019 ojdbc8.jar
-rw-r--r-- 1 bcweb oinstall 1680074 Apr 17 2019 ucp.jar
-rw-r--r-- 1 bcweb oinstall 236960 Apr 17 2019 uidrvci
-rw-r--r-- 1 bcweb oinstall 74263 Apr 17 2019 xstreams.jar
error:
Error: DPI-1047: Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory". See https://oracle.github.io/node-oracledb/INSTALL.html for help
Node-oracledb installation instructions: https://oracle.github.io/node-oracledb/INSTALL.html
You must have 64-bit Oracle client libraries in LD_LIBRARY_PATH, or configured with ldconfig.
If you do not have Oracle Database on this computer, then install the Instant Client Basic or Basic Light package from
http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html
at OracleDb.getConnection (/home/bcweb/web/backend/node_modules/oracledb/lib/oracledb.js:272:25)
at OracleDb.getConnection (/home/bcweb/web/backend/node_modules/oracledb/lib/util.js:176:19)
Adding LD_LIBRARY_PATH to your Bash configuration files is a potential failure point: there are many situations where it won't take effect. It's better to use ldconfig, as shown in the node-oracledb installation instructions for Instant Client ZIP files. However, if you have other Oracle software (such as the database itself) on this computer, then you wouldn't do this. Update your question with more details if this is the case and you want help.
Finally, Instant Client 19.10 is out, so there's no reason to still be using 19.3.
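A minimal sketch of the ldconfig approach, assuming the Instant Client files live in the /home/bcweb/oracleclient/instantclient_19_3 directory from the question (run as root; adjust the path to your install):

```shell
# Register the Instant Client directory with the dynamic linker system-wide,
# so it works regardless of which shell or service starts Node.
# The path below is the one from the question; change it to match your install.
echo /home/bcweb/oracleclient/instantclient_19_3 > /etc/ld.so.conf.d/oracle-instantclient.conf
ldconfig

# Verify the linker can now resolve the client library:
ldconfig -p | grep libclntsh
```

Separately, note that in the listing above libclntsh.so is only 17 bytes, which suggests the symlinks were copied as plain text files rather than preserved as links; if so, recreating the link (ln -sf libclntsh.so.19.1 libclntsh.so inside the Instant Client directory) may also be needed before the library will load.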
Related
I have a Vagrant 'virtualbox' virtual machine, but I can no longer access it from another PC to retrieve the information contained in the virtual machine.
'ubuntu/focus'
How can I proceed?
In your project directory (the one where you have the Vagrantfile), Vagrant creates a .vagrant folder. The contents of this folder hold the VirtualBox VM information,
for example:
ls -all .vagrant/machines/default/virtualbox/
total 72
drwxr-xr-x 11 fhenri staff 352 Aug 28 22:20 .
drwxr-xr-x 3 fhenri staff 96 Aug 27 22:38 ..
-rw-r--r-- 1 fhenri staff 40 Aug 28 22:20 action_provision
-rw-r--r-- 1 fhenri staff 10 Aug 28 22:18 action_set_name
-rw-r--r-- 1 fhenri staff 148 Sep 6 09:42 box_meta
-rw-r--r-- 1 fhenri staff 3 Aug 28 22:18 creator_uid
-rw-r--r-- 1 fhenri staff 36 Aug 28 22:18 id
-rw-r--r-- 1 fhenri staff 32 Aug 28 22:18 index_uuid
-rw------- 1 fhenri staff 1679 Aug 28 22:18 private_key
-rw-r--r-- 1 fhenri staff 316 Sep 6 10:18 synced_folders
-rw-r--r-- 1 fhenri staff 63 Aug 28 22:18 vagrant_cwd
In this case, the name of the Vagrant machine is default and the provider is virtualbox.
If you copy the VirtualBox VM to another PC, you also need to copy this folder or recreate it. It contains the id of the VirtualBox VM, which is what lets Vagrant map and link to the VM.
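As a sketch, moving the metadata folder along with the project might look like this. The /tmp paths are hypothetical stand-ins: in practice SRC is the project folder on the old PC and DST the one on the new PC, and the UUID written here is a fake placeholder.

```shell
# Hypothetical stand-in directories for illustration.
SRC=/tmp/old-project
DST=/tmp/new-project

# Simulate the old project's .vagrant metadata; the id file holds the
# VirtualBox VM's UUID, which vagrant uses to find the existing VM.
mkdir -p "$SRC/.vagrant/machines/default/virtualbox"
echo "8c2e0d6f-0000-0000-0000-000000000000" > "$SRC/.vagrant/machines/default/virtualbox/id"

# Copy the whole .vagrant folder next to the Vagrantfile on the new PC:
mkdir -p "$DST"
cp -R "$SRC/.vagrant" "$DST/"
```

After the copy, running vagrant status in the new project directory should report the existing VM rather than creating a fresh one, provided the VM itself was also imported into VirtualBox on that machine.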
I'm following the Anchor docs here, but I keep getting this error...
BPF SDK path does not exist: /Users/herbie/.cargo/bin/sdk/bpf: No such file or directory (os error 2)
I ran ls -al /Users/herbie/.cargo/bin and got this output:
total 239152
drwxr-xr-x 17 herbie staff 544 31 Jan 16:55 .
drwxr-xr-x 9 herbie staff 288 13 Dec 11:58 ..
-rwxr-xr-x 1 herbie staff 12574724 31 Jan 16:49 anchor
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 cargo
-rwxr-xr-x 1 herbie staff 7578989 14 Dec 14:05 cargo-build-bpf
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 cargo-clippy
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 cargo-fmt
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 cargo-miri
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 clippy-driver
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rls
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rust-gdb
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rust-lldb
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rustc
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rustdoc
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rustfmt
-rwxr-xr-x 12 herbie staff 8521112 31 Jan 16:55 rustup
Haven't found much online, and have never heard of BPF before...
It's unclear at what point in the installation you're getting the error, but here are a few things to try:
be sure that you're using an up-to-date version of Rust stable with rustup update stable
check that you're using the Solana CLI version designated in the docs using solana -V
run cargo build-bpf on the hello world Rust application: https://github.com/solana-labs/example-helloworld/tree/master/src/program-rust
For reference, BPF is the bytecode format used by Solana on-chain programs. You can find more info at the links within https://docs.solana.com/developing/on-chain-programs/overview#berkeley-packet-filter-bpf
Try removing the Solana cache before running your code. It worked for me. Basically, the BPF SDK hadn't been installed correctly.
rm -rf ~/.cache/solana/*
After deleting the Solana cache, run the build again; it should download the BPF SDK:
anchor build
Where are the .vbox files that VirtualBox creates on a Mac, and how can I delete them?
I mistakenly deleted the VirtualBox.app file from Applications, and now I want to delete all remaining VirtualBox files.
I tried the following:
drwxr-xr-x# 3 user admin 102 Dec 19 19:57 vagrant
-rw-r--r-- 1 root wheel 1824496 Feb 21 17:17 com.vagrant.vagrant.bom
-rw-r--r-- 1 root wheel 240 Feb 21 17:17 com.vagrant.vagrant.plist
drwx------ 5 root wheel 170 Feb 21 16:59 ubuntu-cloudimg-precise-vagrant-amd64_1487725164672_27815
./private/var/root/VirtualBox VMs/ubuntu-cloudimg-precise-vagrant-amd64_1487725164672_27815:
-rw------- 1 root wheel 3036 Feb 21 16:59 ubuntu-cloudimg-precise-vagrant-amd64_1487725164672_27815.vbox
-rw------- 1 root wheel 3036 Feb 21 16:59 ubuntu-cloudimg-precise-vagrant-amd64_1487725164672_27815.vbox-prev
lrwxr-xr-x 1 root wheel 24 Feb 21 17:17 vagrant -> /opt/vagrant/bin/vagrant
But I could not find the location of .vbox files to manually delete them.
By default, the .vbox files go into your user's home directory:
pwd
/Users/Astro58/VirtualBox VMs
total 264
drwx------# 6 lance staff 204 Feb 22 19:03 ./
drwxr-xr-x+ 88 lance staff 2992 Feb 22 16:11 ../
drwx------ 6 lance staff 204 Feb 22 19:50 Laravel_default_1487808229046_76286/
drwx------ 7 lance staff 238 Feb 20 12:53 legacy_default_1487457532080_39585/
You should be able to delete the "VirtualBox VMs" directory via the rm command or via the Mac Finder.
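As a sketch, the cleanup could look like the following. It is run here against a scratch HOME (with simulated contents) so it is safe to copy-paste; drop the HOME override and the mkdir lines to clean up your real home directory. The ~/Library/VirtualBox settings folder is an additional macOS location worth removing:

```shell
# Use a scratch HOME for a safe demonstration; remove this override
# (and the mkdir simulation below) to clean up for real.
export HOME=/tmp/vbox-cleanup-demo
mkdir -p "$HOME/VirtualBox VMs/legacy_default_1487457532080_39585"
mkdir -p "$HOME/Library/VirtualBox"

# .vbox files live under "VirtualBox VMs"; global settings (VirtualBox.xml)
# live under ~/Library/VirtualBox on macOS.
rm -rf "$HOME/VirtualBox VMs" "$HOME/Library/VirtualBox"
```

The folder name above is the one from the listing in this answer; quote the paths as shown, since "VirtualBox VMs" contains a space.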
How do I add the lz4 native libraries for use by Spark workers?
I have tried adding them via both LD_LIBRARY_PATH and SPARK_LIBRARY_PATH (as suggested, though with no accepted or even upvoted answer, in Apache Spark Native Libraries). Neither works; we get:
java.lang.RuntimeException: native lz4 library not available
at org.apache.hadoop.io.compress.Lz4Codec.getCompressorType(Lz4Codec.java:125)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:165)
at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1201)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1094)
at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1444)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:277)
at BIDMat.HDFSIO.writeThing(HDFSIO.scala:96)
Here is the LD_LIBRARY_PATH:
$echo $LD_LIBRARY_PATH
/usr/local/Cellar/lz4/r131/lib:/usr/local/Cellar/hadoop/2.7.2/libexec/lib:
and the contents of the lz4-related entry:
$ll /usr/local/Cellar/lz4/r131/lib
total 528
-r--r--r-- 1 macuser admin 71144 Sep 21 2015 liblz4.a
drwxr-xr-x 7 macuser admin 238 Sep 21 2015 .
drwxr-xr-x 3 macuser admin 102 Jun 13 10:41 pkgconfig
-r--r--r-- 1 macuser admin 64120 Jun 13 10:41 liblz4.dylib
-r--r--r-- 1 macuser admin 64120 Jun 13 10:41 liblz4.1.dylib
-r--r--r-- 1 macuser admin 64120 Jun 13 10:41 liblz4.1.7.1.dylib
Update your Hadoop jars and it should work fine.
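One way to confirm whether the updated jars fixed things is Hadoop's built-in native-library check. This is a diagnostic sketch, assuming a Hadoop 2.x installation with `hadoop` on your PATH:

```shell
# Prints each native codec and whether the loaded libhadoop supports it;
# after updating the jars, the lz4 line should report "true".
hadoop checknative -a
```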
I'm trying to set up a Hadoop multi-node cluster. When I checked the installation folders, I don't see a folder called /conf.
I see an /etc directory which has a hadoop folder, and inside it a number of XML files:
core-site.xml
hdfs-site.xml
...etc
but no masters file or conf folder.
The etc/hadoop directory structure is expected for current Apache Hadoop 2.x releases. You won't get a conf directory. Here is what the directory structure looks like for a fresh install of Apache Hadoop 2.7.1.
> curl https://www.apache.org/dist/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz 2>/dev/null | tar xf -
> ls -l hadoop-2.7.1/
total 48
-rw-r--r-- 1 chris wheel 15K Jun 28 2015 LICENSE.txt
-rw-r--r-- 1 chris wheel 101B Jun 28 2015 NOTICE.txt
-rw-r--r-- 1 chris wheel 1.3K Jun 28 2015 README.txt
drwxr-xr-x 13 chris wheel 442B Jun 28 2015 bin/
drwxr-xr-x 3 chris wheel 102B Jun 28 2015 etc/
drwxr-xr-x 7 chris wheel 238B Jun 28 2015 include/
drwxr-xr-x 3 chris wheel 102B Jun 28 2015 lib/
drwxr-xr-x 12 chris wheel 408B Jun 28 2015 libexec/
drwxr-xr-x 30 chris wheel 1.0K Jun 28 2015 sbin/
drwxr-xr-x 4 chris wheel 136B Jun 28 2015 share/
> ls -l hadoop-2.7.1/etc/hadoop/
total 304
-rw-r--r-- 1 chris wheel 4.3K Jun 28 2015 capacity-scheduler.xml
-rw-r--r-- 1 chris wheel 1.3K Jun 28 2015 configuration.xsl
-rw-r--r-- 1 chris wheel 318B Jun 28 2015 container-executor.cfg
-rw-r--r-- 1 chris wheel 774B Jun 28 2015 core-site.xml
-rw-r--r-- 1 chris wheel 3.6K Jun 28 2015 hadoop-env.cmd
-rw-r--r-- 1 chris wheel 4.1K Jun 28 2015 hadoop-env.sh
-rw-r--r-- 1 chris wheel 2.4K Jun 28 2015 hadoop-metrics.properties
-rw-r--r-- 1 chris wheel 2.5K Jun 28 2015 hadoop-metrics2.properties
-rw-r--r-- 1 chris wheel 9.5K Jun 28 2015 hadoop-policy.xml
-rw-r--r-- 1 chris wheel 775B Jun 28 2015 hdfs-site.xml
-rw-r--r-- 1 chris wheel 1.4K Jun 28 2015 httpfs-env.sh
-rw-r--r-- 1 chris wheel 1.6K Jun 28 2015 httpfs-log4j.properties
-rw-r--r-- 1 chris wheel 21B Jun 28 2015 httpfs-signature.secret
-rw-r--r-- 1 chris wheel 620B Jun 28 2015 httpfs-site.xml
-rw-r--r-- 1 chris wheel 3.4K Jun 28 2015 kms-acls.xml
-rw-r--r-- 1 chris wheel 1.5K Jun 28 2015 kms-env.sh
-rw-r--r-- 1 chris wheel 1.6K Jun 28 2015 kms-log4j.properties
-rw-r--r-- 1 chris wheel 5.4K Jun 28 2015 kms-site.xml
-rw-r--r-- 1 chris wheel 11K Jun 28 2015 log4j.properties
-rw-r--r-- 1 chris wheel 951B Jun 28 2015 mapred-env.cmd
-rw-r--r-- 1 chris wheel 1.4K Jun 28 2015 mapred-env.sh
-rw-r--r-- 1 chris wheel 4.0K Jun 28 2015 mapred-queues.xml.template
-rw-r--r-- 1 chris wheel 758B Jun 28 2015 mapred-site.xml.template
-rw-r--r-- 1 chris wheel 10B Jun 28 2015 slaves
-rw-r--r-- 1 chris wheel 2.3K Jun 28 2015 ssl-client.xml.example
-rw-r--r-- 1 chris wheel 2.2K Jun 28 2015 ssl-server.xml.example
-rw-r--r-- 1 chris wheel 2.2K Jun 28 2015 yarn-env.cmd
-rw-r--r-- 1 chris wheel 4.5K Jun 28 2015 yarn-env.sh
-rw-r--r-- 1 chris wheel 690B Jun 28 2015 yarn-site.xml
If you are moving to Apache Hadoop 2.x from Apache Hadoop 1.x, then you might be expecting to see an older directory layout, which did have a conf directory. Here is what it looks like for Apache Hadoop 1.2.1.
> curl https://www.apache.org/dist/hadoop/common/hadoop-1.2.1/hadoop-1.2.1-bin.tar.gz 2>/dev/null | tar xf -
> ls -l hadoop-1.2.1/
total 16680
-rw-r--r-- 1 chris wheel 482K Jul 22 2013 CHANGES.txt
-rw-r--r-- 1 chris wheel 13K Jul 22 2013 LICENSE.txt
-rw-r--r-- 1 chris wheel 101B Jul 22 2013 NOTICE.txt
-rw-r--r-- 1 chris wheel 1.3K Jul 22 2013 README.txt
drwxr-xr-x 19 chris wheel 646B Jul 22 2013 bin/
-rw-r--r-- 1 chris wheel 118K Jul 22 2013 build.xml
drwxr-xr-x 4 chris wheel 136B Jul 22 2013 c++/
drwxr-xr-x 19 chris wheel 646B Jul 22 2013 conf/
drwxr-xr-x 10 chris wheel 340B Jul 22 2013 contrib/
-rw-r--r-- 1 chris wheel 6.7K Jul 22 2013 hadoop-ant-1.2.1.jar
-rw-r--r-- 1 chris wheel 414B Jul 22 2013 hadoop-client-1.2.1.jar
-rw-r--r-- 1 chris wheel 4.0M Jul 22 2013 hadoop-core-1.2.1.jar
-rw-r--r-- 1 chris wheel 139K Jul 22 2013 hadoop-examples-1.2.1.jar
-rw-r--r-- 1 chris wheel 417B Jul 22 2013 hadoop-minicluster-1.2.1.jar
-rw-r--r-- 1 chris wheel 3.0M Jul 22 2013 hadoop-test-1.2.1.jar
-rw-r--r-- 1 chris wheel 377K Jul 22 2013 hadoop-tools-1.2.1.jar
drwxr-xr-x 13 chris wheel 442B Jul 22 2013 ivy/
-rw-r--r-- 1 chris wheel 10K Jul 22 2013 ivy.xml
drwxr-xr-x 52 chris wheel 1.7K Jul 22 2013 lib/
drwxr-xr-x 4 chris wheel 136B Jul 22 2013 libexec/
drwxr-xr-x 9 chris wheel 306B Jul 22 2013 sbin/
drwxr-xr-x 3 chris wheel 102B Jul 22 2013 share/
drwxr-xr-x 3 chris wheel 102B Dec 29 09:44 src/
drwxr-xr-x 9 chris wheel 306B Jul 22 2013 webapps/
> ls -l hadoop-1.2.1/conf/
total 160
-rw-r--r-- 1 chris wheel 7.3K Jul 22 2013 capacity-scheduler.xml
-rw-r--r-- 1 chris wheel 1.1K Jul 22 2013 configuration.xsl
-rw-r--r-- 1 chris wheel 178B Jul 22 2013 core-site.xml
-rw-r--r-- 1 chris wheel 327B Jul 22 2013 fair-scheduler.xml
-rw-r--r-- 1 chris wheel 2.4K Jul 22 2013 hadoop-env.sh
-rw-r--r-- 1 chris wheel 2.0K Jul 22 2013 hadoop-metrics2.properties
-rw-r--r-- 1 chris wheel 4.5K Jul 22 2013 hadoop-policy.xml
-rw-r--r-- 1 chris wheel 178B Jul 22 2013 hdfs-site.xml
-rw-r--r-- 1 chris wheel 4.9K Jul 22 2013 log4j.properties
-rw-r--r-- 1 chris wheel 2.0K Jul 22 2013 mapred-queue-acls.xml
-rw-r--r-- 1 chris wheel 178B Jul 22 2013 mapred-site.xml
-rw-r--r-- 1 chris wheel 10B Jul 22 2013 masters
-rw-r--r-- 1 chris wheel 10B Jul 22 2013 slaves
-rw-r--r-- 1 chris wheel 2.0K Jul 22 2013 ssl-client.xml.example
-rw-r--r-- 1 chris wheel 1.9K Jul 22 2013 ssl-server.xml.example
-rw-r--r-- 1 chris wheel 3.8K Jul 22 2013 task-log4j.properties
-rw-r--r-- 1 chris wheel 382B Jul 22 2013 taskcontroller.cfg
However, in the Apache Hadoop 2.x distro, you won't get a conf directory.
The contents of conf in Hadoop 1.x and etc/hadoop in Hadoop 2.x are somewhat similar. You'll see the various *-site.xml files. More details on configuration are available in the Cluster Setup guide.