Ansible ansible.builtin.url lookup "Name does not resolve" error - debugging

I am trying to use the ansible.builtin.url lookup to fetch the contents of a remote file on my remote server. I am using one of the examples from the plugin's official documentation page, but occasionally I get a Name does not resolve error. I say occasionally because this behaviour is not consistent: sometimes I get the results, sometimes I get the error...
How do you suggest I debug this error?
Ansible task
- name: url lookup splits lines by default
  debug: msg="{{item}}"
  loop: "{{ lookup('url', 'https://ip-ranges.amazonaws.com/ip-ranges.json', wantlist=True) }}"
Playbook output
TASK [url lookup splits lines by default] *********************************************************************************************
fatal: [my.domain.name]: FAILED! => {"msg": "An unhandled exception occurred while running the lookup plugin 'url'. Error was a <class 'ansible.errors.AnsibleError'>, original message: Failed lookup url for https://ip-ranges.amazonaws.com/ip-ranges.json : <urlopen error [Errno -2] Name does not resolve>"}
Ansible version
$ ansible --version
ansible 2.10.2
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.6 (default, Oct 5 2020, 00:23:48) [GCC 10.2.0]
Python packages versions
$ pip3 list
Package Version
---------------------- --------------------
appdirs 1.4.3
attrs 19.3.0
Automat 0.8.0
blinker 1.4
certifi 2019.11.28
chardet 3.0.4
Click 7.0
cloud-init 20.3
colorama 0.4.3
command-not-found 0.3
configobj 5.0.6
constantly 15.1.0
cryptography 2.8
dbus-python 1.2.16
distlib 0.3.0
distro 1.4.0
distro-info 0.23ubuntu1
entrypoints 0.3
fail2ban 0.11.1
filelock 3.0.12
httplib2 0.14.0
hyperlink 19.0.0
idna 2.8
importlib-metadata 1.5.0
incremental 16.10.1
Jinja2 2.10.1
jsonpatch 1.22
jsonpointer 2.0
jsonschema 3.2.0
keyring 18.0.1
language-selector 0.1
launchpadlib 1.10.13
lazr.restfulclient 0.14.2
lazr.uri 1.0.3
MarkupSafe 1.1.0
more-itertools 4.2.0
netifaces 0.10.4
oauthlib 3.1.0
pexpect 4.6.0
pip 20.0.2
pyasn1 0.4.2
pyasn1-modules 0.2.1
PyGObject 3.36.0
PyHamcrest 1.9.0
pyinotify 0.9.6
PyJWT 1.7.1
pymacaroons 0.13.0
PyNaCl 1.3.0
pyOpenSSL 19.0.0
pyrsistent 0.15.5
pyserial 3.4
python-apt 2.0.0+ubuntu0.20.4.1
python-debian 0.1.36ubuntu1
PyYAML 5.3.1
requests 2.22.0
requests-unixsocket 0.2.0
SecretStorage 2.3.1
service-identity 18.1.0
setuptools 45.2.0
simplejson 3.16.0
six 1.14.0
sos 4.0
ssh-import-id 5.10
systemd-python 234
Twisted 18.9.0
ubuntu-advantage-tools 20.3
ufw 0.36
unattended-upgrades 0.1
urllib3 1.25.8
virtualenv 20.0.17
wadllib 1.3.3
wheel 0.34.2
zipp 1.0.0
zope.interface 4.7.1
PS: Running dig ip-ranges.amazonaws.com on the target machine returns this output:
$ dig ip-ranges.amazonaws.com
; <<>> DiG 9.16.1-Ubuntu <<>> ip-ranges.amazonaws.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 16015
;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 65494
;; QUESTION SECTION:
;ip-ranges.amazonaws.com. IN A
;; ANSWER SECTION:
ip-ranges.amazonaws.com. 900 IN CNAME d3mduebighmd0u.cloudfront.net.
d3mduebighmd0u.cloudfront.net. 59 IN A 13.224.93.97
d3mduebighmd0u.cloudfront.net. 59 IN A 13.224.93.6
d3mduebighmd0u.cloudfront.net. 59 IN A 13.224.93.36
d3mduebighmd0u.cloudfront.net. 59 IN A 13.224.93.58
;; Query time: 40 msec
;; SERVER: 127.0.0.53#53(127.0.0.53)
;; WHEN: Thu Nov 19 11:42:03 UTC 2020
;; MSG SIZE rcvd: 159

The problem had to do with the OS image the controller was running: changing the container's base image from alpine:edge to alpine:3.12.1 fixed it. This issue helped: https://github.com/gliderlabs/docker-alpine/issues/539
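For anyone debugging a similar intermittent failure: the url lookup runs on the Ansible controller, so the name has to resolve inside the controller's own Python runtime, not on the managed host. dig can look perfectly healthy while Python's getaddrinfo still fails now and then, which appears to be what the linked Alpine/musl issue describes. A minimal sketch (assuming the same URL as in the task) that exercises the same resolver and HTTP path as the lookup, run with the controller's Python:

# dns_check.py - run on the Ansible controller, not on the managed host.
# Repeats the lookup a few times to catch intermittent resolver failures.
import socket
import urllib.request

HOST = "ip-ranges.amazonaws.com"
URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

for attempt in range(5):
    try:
        # getaddrinfo is what urllib (and therefore the url lookup) relies on
        addrs = sorted({info[4][0] for info in socket.getaddrinfo(HOST, 443)})
        print(f"attempt {attempt}: resolved to {addrs}")
    except socket.gaierror as exc:
        print(f"attempt {attempt}: resolution failed: {exc}")

# Confirm the full HTTPS fetch works as well
with urllib.request.urlopen(URL) as resp:
    print("HTTP status:", resp.status)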

Related

(Windows 10) Miniconda Pyopencl - ImportError: DLL load failed while importing _cl: The specified procedure could not be found

I've been working with pyopencl on my Macbook. I installed miniconda and installed pyopencl from there by using these instructions.
On my Mac everything works fine, I can run a small example program and it works.
I then tried to use pyopencl on my Windows 10 desktop PC as that has a proper GPU (Nvidia GeForce GTX 1080TI), but I can't seem to get it to work with the same example program.
Like with the Mac I followed these instructions and installed miniconda and then pyopencl.
However when I run my example program I get the following error:
C:\Users\Grant\Desktop\Exercises-Solutions-1.2.1\Exercises\Exercise03\Python>python vadd.py
Traceback (most recent call last):
File "vadd.py", line 13, in <module>
import pyopencl as cl
File "C:\ProgramData\Miniconda3\lib\site-packages\pyopencl\__init__.py", line 29, in <module>
import pyopencl.cltypes # noqa: F401
File "C:\ProgramData\Miniconda3\lib\site-packages\pyopencl\cltypes.py", line 22, in <module>
from pyopencl.tools import get_or_register_dtype
File "C:\ProgramData\Miniconda3\lib\site-packages\pyopencl\tools.py", line 37, in <module>
from pyopencl._cl import bitlog2 # noqa: F401
ImportError: DLL load failed while importing _cl: The specified procedure could not be found.
This is what I have installed via conda:
C:\Users\Grant\Desktop\Exercises-Solutions-1.2.1\Exercises\Exercise03\Python>conda list
# packages in environment at C:\ProgramData\Miniconda3:
#
# Name Version Build Channel
appdirs 1.4.3 py_1 conda-forge
ca-certificates 2020.6.20 hecda079_0 conda-forge
certifi 2020.6.20 py38h9bdc248_2 conda-forge
cffi 1.14.0 py38h7a1dbc1_0
chardet 3.0.4 py38_1003
conda 4.8.5 py38h9bdc248_2 conda-forge
conda-package-handling 1.6.1 py38h62dcd97_0
console_shortcut 0.1.1 4
cryptography 2.9.2 py38h7a1dbc1_0
decorator 4.4.2 py_0 conda-forge
idna 2.9 py_1
intel-openmp 2020.1 216
khronos-opencl-icd-loader 2020.06.16 h62dcd97_1 conda-forge
libblas 3.8.0 16_mkl conda-forge
libcblas 3.8.0 16_mkl conda-forge
liblapack 3.8.0 16_mkl conda-forge
mako 1.1.3 pyh9f0ad1d_0 conda-forge
markupsafe 1.1.1 py38hab1e662_2 conda-forge
menuinst 1.4.16 py38he774522_0
mkl 2020.1 216
numpy 1.19.2 py38hdf1ac2f_1 conda-forge
openssl 1.1.1h he774522_0 conda-forge
pip 20.0.2 py38_3
powershell_shortcut 0.0.1 3
pycosat 0.6.3 py38he774522_0
pycparser 2.20 py_0
pyopencl 2020.2.2 py38hfd46600_1 conda-forge
pyopenssl 19.1.0 py38_0
pysocks 1.7.1 py38_0
python 3.8.3 he1778fa_0
python_abi 3.8 1_cp38 conda-forge
pytools 2020.4 pyh9f0ad1d_0 conda-forge
pywin32 227 py38he774522_1
requests 2.23.0 py38_0
ruamel_yaml 0.15.87 py38he774522_0
setuptools 46.4.0 py38_0
six 1.14.0 py38_0
sqlite 3.31.1 h2a8f88b_1
tqdm 4.46.0 py_0
urllib3 1.25.8 py38_0
vc 14.1 h0510ff6_4
vs2015_runtime 14.16.27012 hf0eaf9b_1
wheel 0.34.2 py38_0
win_inet_pton 1.1.0 py38_0
wincertstore 0.2 py38_0
yaml 0.1.7 hc54c509_2
zlib 1.2.11 h62dcd97_4
I'm not sure if I'm missing something, but it can't seem to find the DLL. I tried ensuring my GPU drivers were up to date; that didn't work, so I then installed the Nvidia CUDA toolkit. Once again this had no effect and the problem persists.
The only possible clue I have is that in the miniconda installation directory on my MacBook there is an OpenCL folder containing a vendors folder with an apple.icd file in it:
miniconda3/etc/OpenCL/vendors/apple.icd.
However, the miniconda install directory on my Windows PC seems to be missing that OpenCL folder, and by extension the vendors folder and .icd file.
C:\ProgramData\Miniconda3\etc>dir
Volume in drive C has no label.
Volume Serial Number is 4E38-45B0
Directory of C:\ProgramData\Miniconda3\etc
12/10/2020 17:54 <DIR> .
12/10/2020 17:54 <DIR> ..
12/10/2020 17:54 <DIR> fish
12/10/2020 18:04 <DIR> profile.d
0 File(s) 0 bytes
4 Dir(s) 37,407,539,200 bytes free
This looks to me like the cause of the issue, but I'm not sure. And even if it is, I don't know what I would need to do to produce this .icd file.
Any help would be appreciated.
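Not a full answer, but one way to test the ICD theory: on Windows, OpenCL vendor drivers are registered in the registry rather than through an etc/OpenCL/vendors folder, so the missing folder on its own is not conclusive. A small sketch (assuming the standard Khronos registry location) that lists whichever vendor ICDs are registered; if nothing shows up, the GPU driver has not registered an OpenCL ICD for the loader to find:

# list_icds.py - enumerate the OpenCL ICDs registered with Windows.
# Assumes the standard Khronos registry key; adjust if your setup differs.
import winreg

KEY_PATH = r"SOFTWARE\Khronos\OpenCL\Vendors"

try:
    key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
except FileNotFoundError:
    print("No OpenCL\\Vendors key found - no ICDs appear to be registered.")
else:
    with key:
        index = 0
        while True:
            try:
                # Each value name is the path to a vendor ICD DLL; 0 means enabled.
                name, value, _ = winreg.EnumValue(key, index)
            except OSError:
                break
            print("Registered ICD: {} (value={})".format(name, value))
            index += 1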

Dask: When reading from HDFS, pyarrow/hdfs.py returns OSError: Getting symbol hdfsNewBuilder failed

I was trying to run dask-on-yarn with my research group's Hadoop cluster.
I tried each of the following instructions:
dd.read_parquet('hdfs://file.parquet', engine='fastparquet')
dd.read_parquet('hdfs://file.parquet', engine='pyarrow')
dd.read_csv('hdfs://file.csv')
Each time, the following error message occurs:
~/miniconda3/envs/dask/lib/python3.8/site-packages/fsspec/core.py in get_fs_token_paths(urlpath, mode, num, name_function, storage_options, protocol)
521 path = cls._strip_protocol(urlpath)
522 update_storage_options(options, storage_options)
--> 523 fs = cls(**options)
524
525 if "w" in mode:
~/miniconda3/envs/dask/lib/python3.8/site-packages/fsspec/spec.py in __call__(cls, *args, **kwargs)
52 return cls._cache[token]
53 else:
---> 54 obj = super().__call__(*args, **kwargs)
55 # Setting _fs_token here causes some static linters to complain.
56 obj._fs_token_ = token
~/miniconda3/envs/dask/lib/python3.8/site-packages/fsspec/implementations/hdfs.py in __init__(self, host, port, user, kerb_ticket, driver, extra_conf, **kwargs)
42 AbstractFileSystem.__init__(self, **kwargs)
43 self.pars = (host, port, user, kerb_ticket, driver, extra_conf)
---> 44 self.pahdfs = HadoopFileSystem(
45 host=host,
46 port=port,
~/miniconda3/envs/dask/lib/python3.8/site-packages/pyarrow/hdfs.py in __init__(self, host, port, user, kerb_ticket, driver, extra_conf)
38 _maybe_set_hadoop_classpath()
39
---> 40 self._connect(host, port, user, kerb_ticket, extra_conf)
41
42 def __reduce__(self):
~/miniconda3/envs/dask/lib/python3.8/site-packages/pyarrow/io-hdfs.pxi in pyarrow.lib.HadoopFileSystem._connect()
~/miniconda3/envs/dask/lib/python3.8/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
OSError: Getting symbol hdfsNewBuilder failed
How should I resolve this problem?
My Environment
Here are my packages in this conda env:
# Name Version Build Channel
_libgcc_mutex 0.1 main
abseil-cpp 20200225.2 he1b5a44_0 conda-forge
arrow-cpp 0.17.1 py38h1234567_9_cpu conda-forge
attrs 19.3.0 py_0
aws-sdk-cpp 1.7.164 hc831370_1 conda-forge
backcall 0.2.0 py_0
blas 1.0 mkl
bleach 3.1.5 py_0
bokeh 2.1.1 py38_0
boost-cpp 1.72.0 h7b93d67_1 conda-forge
brotli 1.0.7 he6710b0_0
brotlipy 0.7.0 py38h7b6447c_1000
bzip2 1.0.8 h7b6447c_0
c-ares 1.15.0 h7b6447c_1001
ca-certificates 2020.6.24 0
certifi 2020.6.20 py38_0
cffi 1.14.0 py38he30daa8_1
chardet 3.0.4 py38_1003
click 7.1.2 py_0
cloudpickle 1.4.1 py_0
conda-pack 0.4.0 py_0
cryptography 2.9.2 py38h1ba5d50_0
curl 7.71.0 hbc83047_0
cytoolz 0.10.1 py38h7b6447c_0
dask 2.19.0 py_0
dask-core 2.19.0 py_0
dask-yarn 0.8.1 py38h32f6830_0 conda-forge
decorator 4.4.2 py_0
defusedxml 0.6.0 py_0
distributed 2.19.0 py38_0
entrypoints 0.3 py38_0
fastparquet 0.3.2 py38heb32a55_0
freetype 2.10.2 h5ab3b9f_0
fsspec 0.7.4 py_0
gflags 2.2.2 he6710b0_0
glog 0.4.0 he6710b0_0
grpc-cpp 1.30.0 h9ea6770_0 conda-forge
grpcio 1.27.2 py38hf8bcb03_0
heapdict 1.0.1 py_0
icu 67.1 he1b5a44_0 conda-forge
idna 2.10 py_0
importlib-metadata 1.7.0 py38_0
importlib_metadata 1.7.0 0
intel-openmp 2020.1 217
ipykernel 5.3.0 py38h5ca1d4c_0
ipython 7.16.1 py38h5ca1d4c_0
ipython_genutils 0.2.0 py38_0
jedi 0.17.1 py38_0
jinja2 2.11.2 py_0
jpeg 9b h024ee3a_2
json5 0.9.5 py_0
jsonschema 3.2.0 py38_0
jupyter_client 6.1.3 py_0
jupyter_core 4.6.3 py38_0
jupyterlab 2.1.5 py_0
jupyterlab_server 1.1.5 py_0
krb5 1.18.2 h173b8e3_0
ld_impl_linux-64 2.33.1 h53a641e_7
libcurl 7.71.0 h20c2e04_0
libedit 3.1.20191231 h7b6447c_0
libevent 2.1.10 hcdb4288_1 conda-forge
libffi 3.3 he6710b0_1
libgcc-ng 9.1.0 hdf63c60_0
libgfortran-ng 7.3.0 hdf63c60_0
libllvm9 9.0.1 h4a3c616_0
libpng 1.6.37 hbc83047_0
libprotobuf 3.12.3 hd408876_0
libsodium 1.0.18 h7b6447c_0
libssh2 1.9.0 h1ba5d50_1
libstdcxx-ng 9.1.0 hdf63c60_0
libtiff 4.1.0 h2733197_1
llvmlite 0.33.0 py38hd408876_0
locket 0.2.0 py38_1
lz4-c 1.9.2 he6710b0_0
markupsafe 1.1.1 py38h7b6447c_0
mistune 0.8.4 py38h7b6447c_1000
mkl 2020.1 217
mkl-service 2.3.0 py38he904b0f_0
mkl_fft 1.1.0 py38h23d657b_0
mkl_random 1.1.1 py38h0573a6f_0
msgpack-python 1.0.0 py38hfd86e86_1
nbconvert 5.6.1 py38_0
nbformat 5.0.7 py_0
ncurses 6.2 he6710b0_1
notebook 6.0.3 py38_0
numba 0.50.1 py38h0573a6f_0
numpy 1.18.5 py38ha1c710e_0
numpy-base 1.18.5 py38hde5b4d6_0
olefile 0.46 py_0
openssl 1.1.1g h7b6447c_0
packaging 20.4 py_0
pandas 1.0.5 py38h0573a6f_0
pandoc 2.9.2.1 0
pandocfilters 1.4.2 py38_1
parquet-cpp 1.5.1 2 conda-forge
parso 0.7.0 py_0
partd 1.1.0 py_0
pexpect 4.8.0 py38_0
pickleshare 0.7.5 py38_1000
pillow 7.1.2 py38hb39fc2d_0
pip 20.1.1 py38_1
prometheus_client 0.8.0 py_0
prompt-toolkit 3.0.5 py_0
protobuf 3.12.3 py38he6710b0_0
psutil 5.7.0 py38h7b6447c_0
ptyprocess 0.6.0 py38_0
pyarrow 0.17.1 py38h1234567_9_cpu conda-forge
pycparser 2.20 py_0
pygments 2.6.1 py_0
pyopenssl 19.1.0 py38_0
pyparsing 2.4.7 py_0
pyrsistent 0.16.0 py38h7b6447c_0
pysocks 1.7.1 py38_0
python 3.8.3 hcff3b4d_2
python-dateutil 2.8.1 py_0
python_abi 3.8 1_cp38 conda-forge
pytz 2020.1 py_0
pyyaml 5.3.1 py38h7b6447c_1
pyzmq 19.0.1 py38he6710b0_1
re2 2020.07.01 he1b5a44_0 conda-forge
readline 8.0 h7b6447c_0
requests 2.24.0 py_0
send2trash 1.5.0 py38_0
setuptools 47.3.1 py38_0
six 1.15.0 py_0
skein 0.8.0 py38h32f6830_1 conda-forge
snappy 1.1.8 he6710b0_0
sortedcontainers 2.2.2 py_0
sqlite 3.32.3 h62c20be_0
tbb 2020.0 hfd86e86_0
tblib 1.6.0 py_0
terminado 0.8.3 py38_0
testpath 0.4.4 py_0
thrift 0.13.0 py38he6710b0_0
thrift-cpp 0.13.0 h62aa4f2_2 conda-forge
tk 8.6.10 hbc83047_0
toolz 0.10.0 py_0
tornado 6.0.4 py38h7b6447c_1
traitlets 4.3.3 py38_0
typing_extensions 3.7.4.2 py_0
urllib3 1.25.9 py_0
wcwidth 0.2.5 py_0
webencodings 0.5.1 py38_1
wheel 0.34.2 py38_0
xz 5.2.5 h7b6447c_0
yaml 0.2.5 h7b6447c_0
zeromq 4.3.2 he6710b0_2
zict 2.0.0 py_0
zipp 3.1.0 py_0
zlib 1.2.11 h7b6447c_3
zstd 1.4.4 h0b5b093_3
The Hadoop cluster is running Hadoop 2.7.0-mapr-1607.
The Cluster object is created with:
# Create a cluster where each worker has two cores and eight GiB of memory
cluster = YarnCluster(
    environment='conda-env-packed-for-worker-nodes.tar.gz',
    worker_env={
        # See https://github.com/dask/dask-yarn/pull/30#issuecomment-434001858
        'ARROW_LIBHDFS_DIR': '/opt/mapr/hadoop/hadoop-0.20.2/c++/Linux-amd64-64/lib',
    },
)
Suspected Cause
I suspect that the version mismatch between the hadoop-0.20.2 in the ARROW_LIBHDFS_DIR environment variable and the Hadoop CLI version (Hadoop 2.7.0) might be a cause of the problem.
I had to manually point pyarrow at this file (using this setup: https://stackoverflow.com/a/62749053/1147061). The required file, libhdfs.so, is not provided under /opt/mapr/hadoop/hadoop-2.7.0/. Installing libhdfs3 via conda install -c conda-forge libhdfs3 does not resolve the requirement, either.
Might this be the problem?
(a partial answer)
To use libhdfs3 (which is poorly maintained these days), you would need to call
dd.read_csv('hdfs://file.csv', storage_options={'driver': 'libhdfs3'})
and, of course, install libhdfs3. This will not help with the hadoop library option; they are independent code paths.
I also suspect that getting the JNI libhdfs (without the "3") working is a case of locating the right .so file.
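If you want to test the JNI libhdfs route in isolation before involving Dask, something along these lines can help confirm whether ARROW_LIBHDFS_DIR actually points at a usable libhdfs.so. The library path, namenode host, and port below are placeholders, and pyarrow 0.17.x also needs JAVA_HOME and the Hadoop classpath to be resolvable:

# hdfs_probe.py - check that pyarrow can load libhdfs before involving Dask.
# The library path, host, and port are placeholders; adjust for your cluster.
import os

# Must be set before pyarrow attempts to load the JNI library.
os.environ["ARROW_LIBHDFS_DIR"] = "/path/to/dir/containing/libhdfs.so"

import pyarrow.hdfs

# pyarrow 0.17.x exposes the (now deprecated) hdfs.connect() helper, which
# constructs the same HadoopFileSystem class shown in the traceback above.
fs = pyarrow.hdfs.connect(host="namenode.example.com", port=8020)
print(fs.ls("/"))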

How to install/update to sympy 1.4 in the latest Anaconda 2019.03?

Update
Thanks to the hint by #cel below, the command to use is:
>sudo conda install sympy=1.4
## Package Plan ##
environment location: /opt/anaconda
added / updated specs:
- sympy=1.4
The following packages will be downloaded:
package | build
---------------------------|-----------------
sympy-1.4 | py37_0 9.7 MB
------------------------------------------------------------
Total: 9.7 MB
The following packages will be REMOVED:
anaconda-2019.03-py37_0
The following packages will be UPDATED:
sympy 1.3-py37_0 --> 1.4-py37_0
Proceed ([y]/n)? y
Verified OK after installation:
>python
Python 3.7.3 (default, Mar 27 2019, 22:11:17)
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sympy
>>> sympy.__version__
'1.4'
>>>
I have no idea why the other commands did not work, but the above works.
Original question
I am using Manjaro Linux 64-bit and installed the latest Anaconda:
>which python
/opt/anaconda/bin/python
>conda list anaconda
# packages in environment at /opt/anaconda:
#
# Name Version Build Channel
anaconda 2019.03 py37_0
anaconda-client 1.7.2 py37_0
anaconda-navigator 1.9.7 py37_0
anaconda-project 0.8.2 py37_0
The problem is that it comes with sympy 1.3, while the latest sympy is 1.4 according to
https://github.com/sympy/sympy/releases
sympy 1.4 came out 3 weeks ago.
Now running
>sudo conda update sympy
does not update it. It says:
## Package Plan ##
environment location: /opt/anaconda
added / updated specs:
- sympy
The following packages will be downloaded:
package | build
---------------------------|-----------------
ca-certificates-2019.1.23 | 0 126 KB
certifi-2019.3.9 | py37_0 155 KB
conda-4.6.14 | py37_0 2.1 MB
openssl-1.1.1b | h7b6447c_1 4.0 MB
sympy-1.3 | py37_0 9.5 MB
------------------------------------------------------------
Total: 15.9 MB
But according to https://anaconda.org/anaconda/sympy, sympy 1.4 is available.
I also tried the following command, and it does not update sympy either:
>sudo conda install -c anaconda sympy
## Package Plan ##
environment location: /opt/anaconda
added / updated specs:
- sympy
The following packages will be downloaded:
package | build
---------------------------|-----------------
ca-certificates-2019.1.23 | 0 126 KB anaconda
certifi-2019.3.9 | py37_0 155 KB anaconda
conda-4.6.14 | py37_0 2.1 MB anaconda
openssl-1.1.1b | h7b6447c_1 4.0 MB anaconda
sympy-1.3 | py37_0 9.5 MB anaconda
------------------------------------------------------------
Total: 15.9 MB
Does anyone know why sympy is not being updated? Anything else to try? I could download the tar file for sympy 1.4, but I do not know what to do with it after that in order to install it in Anaconda.
When conda update does not want to update a package, you can ask conda explicitly to install a specific version: conda install sympy=1.4.

The installed version of WinRM does not support transport(s) [u'']

I'm trying to configure a bunch of Windows machines through Ansible. I have installed pywinrm correctly, but Ansible is throwing an error message.
Command:
ansible-playbook -i inventory/hosts playbooks/common.yml -vvv
Error Message:
TASK [Gathering Facts] **************************************************************************************************************
task path: /media/sf_C_DRIVE/OnlyOnMyPC/ansible.rhel75.master/playbooks/common.yml:1
Using module file /home/jim/.local/lib/python2.7/site-packages/ansible/modules/windows/setup.ps1
fatal: [10.96.1.11]: FAILED! => {
"msg": "The installed version of WinRM does not support transport(s) [u'']"
}
Using module file /home/jim/.local/lib/python2.7/site-packages/ansible/modules/windows/setup.ps1
fatal: [10.96.1.12]: FAILED! => {
"msg": "The installed version of WinRM does not support transport(s) [u'']"
}
The list of packages installed by pip is shown below:
pip list pywinrm
Package Version
------------------------------ ----------
adal 1.2.0
ansible 2.7.4
ansible-lint 3.5.1
applicationinsights 0.11.7
argcomplete 1.9.4
asn1crypto 0.24.0
azure-cli-core 2.0.35
azure-cli-nspkg 3.0.2
azure-common 1.1.11
azure-graphrbac 0.40.0
azure-keyvault 1.0.0a1
azure-mgmt-batch 4.1.0
azure-mgmt-compute 2.1.0
azure-mgmt-containerinstance 0.4.0
azure-mgmt-containerregistry 2.0.0
azure-mgmt-containerservice 3.0.1
azure-mgmt-dns 1.2.0
azure-mgmt-keyvault 0.40.0
azure-mgmt-marketplaceordering 0.1.0
azure-mgmt-monitor 0.5.2
azure-mgmt-network 1.7.1
azure-mgmt-nspkg 2.0.0
azure-mgmt-rdbms 1.2.0
azure-mgmt-resource 1.2.2
azure-mgmt-sql 0.7.1
azure-mgmt-storage 1.5.0
azure-mgmt-trafficmanager 0.50.0
azure-mgmt-web 0.32.0
azure-nspkg 2.0.0
azure-storage 0.35.1
backports.ssl-match-hostname 3.5.0.1
bcrypt 3.1.4
certifi 2018.11.29
cffi 1.11.5
chardet 3.0.4
colorama 0.4.1
configobj 4.7.2
configparser 3.5.0
cryptography 2.4.2
decorator 3.4.0
entrypoints 0.2.3
enum34 1.1.6
humanfriendly 4.17
idna 2.7
iniparse 0.4
ipaddress 1.0.16
isodate 0.6.0
javapackages 1.0.0
Jinja2 2.10
jmespath 0.9.3
keyring 17.0.0
knack 0.3.3
lxml 3.2.1
MarkupSafe 1.1.0
monotonic 1.5
msrest 0.4.29
msrestazure 0.4.31
oauthlib 2.1.0
packaging 18.0
paramiko 2.4.2
perf 0.1
pip 18.1
pyasn1 0.4.4
pycparser 2.19
pycurl 7.19.0
Pygments 2.3.0
pygobject 3.22.0
pygpgme 0.3
PyJWT 1.7.0
pyliblzma 0.5.3
PyNaCl 1.3.0
pyOpenSSL 18.0.0
pyparsing 2.3.0
python-dateutil 2.7.5
python-linux-procfs 0.4.9
pyudev 0.15
pywinrm 0.3.0
pyxattr 0.5.1
PyYAML 3.13
requests 2.20.1
requests-oauthlib 1.0.0
schedutils 0.4
SecretStorage 2.3.1
setuptools 0.9.8
six 1.11.0
slip 0.4.0
slip.dbus 0.4.0
tabulate 0.8.2
typing 3.6.6
urlgrabber 3.10
urllib3 1.24.1
wheel 0.30.0
xmltodict 0.11.0
yum-metadata-parser 1.1.4
I have no idea what the problem is or how to resolve it.
Any assistance would be greatly appreciated.
Thank you very much.
It would appear you are not setting ansible_winrm_transport: or, worse, you have set it to the empty string.
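One way to take Ansible out of the picture is to open a WinRM session directly with pywinrm using an explicit transport; the host, credentials, and transport below are placeholders. If this works, setting ansible_winrm_transport to the same value in your inventory or group_vars should get the playbook past the error:

# winrm_probe.py - sanity-check WinRM connectivity with an explicit transport.
# Run with the same Python interpreter Ansible uses; host, credentials, and
# transport are placeholders, so adjust them for your environment.
import winrm

session = winrm.Session(
    "10.96.1.11",
    auth=("Administrator", "ChangeMe123"),
    transport="ntlm",  # e.g. 'basic', 'ntlm', 'credssp'; kerberos needs extra deps
)

# Run a trivial command and show what came back.
result = session.run_cmd("ipconfig", ["/all"])
print("status:", result.status_code)
print(result.std_out)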

"conda list" will correctly identify pip installed gunicorn, but "conda list -e" option it does not

Windows 10
Latest Anaconda3 build
I created a virtual environment using conda.
I activated this environment (BTW, you must use CMD, as it will not switch if you're using PowerShell).
Now, the gunicorn package cannot be installed using conda (unknown package), so I needed to use pip. Pip installed the package successfully.
When I do "conda list" I see the package:
# packages in environment at C:\Anaconda3\envs\HerokuApp:
#
click 6.6 py35_0
flask 0.11.1 py35_0
gunicorn 19.6.0 <pip>
itsdangerous 0.24 py35_0
jinja2 2.8 py35_1
markupsafe 0.23 py35_2
mkl 11.3.3 1
nltk 3.2.1 py35_0
numpy 1.11.1 py35_0
pandas 0.18.1 np111py35_0
pip 8.1.2 py35_0
python 3.5.1 5
python-dateutil 2.5.3 py35_0
pytz 2016.4 py35_0
scikit-learn 0.17.1 np111py35_1
scipy 0.17.1 np111py35_1
setuptools 23.0.0 py35_0
six 1.10.0 py35_0
vs2015_runtime 14.0.25123 0
werkzeug 0.11.10 py35_0
wheel 0.29.0 py35_0
But when I run "conda list -e" to generate a requirements.txt file (for Heroku), the package is not listed:
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: win-64
click=6.6=py35_0
flask=0.11.1=py35_0
itsdangerous=0.24=py35_0
jinja2=2.8=py35_1
markupsafe=0.23=py35_2
mkl=11.3.3=1
nltk=3.2.1=py35_0
numpy=1.11.1=py35_0
pandas=0.18.1=np111py35_0
pip=8.1.2=py35_0
python=3.5.1=5
python-dateutil=2.5.3=py35_0
pytz=2016.4=py35_0
scikit-learn=0.17.1=np111py35_1
scipy=0.17.1=np111py35_1
setuptools=23.0.0=py35_0
six=1.10.0=py35_0
vs2015_runtime=14.0.25123=0
werkzeug=0.11.10=py35_0
wheel=0.29.0=py35_0
Does anyone know what is going on here, and why?
Thanks
I'm not sure exactly why, but if you use conda env export you'll get the pip-installed packages too. It's not in the requirements.txt format, but it could be useful.
BTW, I do see a py3.5 version of gunicorn for Windows over here if that helps. It is only version 19.3.0, but you can install it via conda install -c phumke gunicorn=19.3.0.
