Using libcurl to authenticate to an NTLM proxy without a password - Windows

I'm testing some simple network processes to understand NTLM better and learn how to work with it.
Following this (ntlm-proxy-without-password) Q&A I found how to authenticate my transaction via NTLM using the login information of the current user.
The command is this: curl.exe -U : --proxy-ntlm --proxy myproxy.com:8080 http://www.google.com
Now I have to do the same thing using libcurl, since I need to achieve that result in the application I'm developing. Is there a way to do this?

This worked like a charm:
curl_easy_setopt(ctx, CURLOPT_PROXYUSERPWD, ":");

Related

curl 1020 error when trying to scrape page using bash script

I'm trying to write a bash script to access a journal overview page on SSRN.
I'm trying to use curl for this, which works for me on other webpages, but it returns error code 1020 if I try to run the following command:
curl https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1925128
I thought it might have to do with the question mark in the URL, but I got it to work with other pages that contained question marks.
It probably has something to do with what the page allows. However, I can also access the page using R's rvest package, so I think it should also work from bash.
Looks like the site has blocked access via curl. Change the user agent and it should work fine, i.e.
curl --user-agent 'Chrome/79' "https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1925128"

Curl or Lynx scripting with Chrome Cookie

Just looking for someone to point me in the right direction. I need to script an interaction with a site that uses a "trust this device" cookie and a login portal. I found the cookie in Chrome, but I'm not sure what to do next. This will be hosted on a CentOS 7 system.
After authenticating to the login portal, I need to access another page using the "trust this device" cookie and the session cookie so I can download files. Manually downloading files every day gets tedious, and the owner of the site does not want to use SFTP.
Update 1:
There was some confusion in my request (I could have made it more clear), I am NOT looking for someone to "write code" for me. This is more a sanity check as I learn how this process works. Please simply point me in the right direction as far as tools and general procedure.
Update 2:
Using the "Copy as curl" option found in most web browsers, I was able to get the correct header information needed for authenticating.
Instead of
curl -b "xxx=xxx"
I needed
curl -H "Cookie: XXXX="%"2Fwpsnew; xxx=xxx"
When adding the -c switch, I can now save the session cookie. Further testing is needed, but at least there is progress.
EDIT
Using the Chrome feature for copying curl commands from the history (this is found in Firefox as well), I was able to partially reproduce results. However, in my case I was not able to log in, as the site I was working with uses additional JS that modifies the cookies.
This initial question can be closed, I will open a new post for more specific parts of my project.

Debugger configuration in Gogland

I want to debug my Go application when I send a request using the curl command.
Currently my request is handled by the binary I have.
What I want is that when I send a request using the curl command, the request should be handled by the code I have, not by the binary.
I did not find any documentation about it; I only found this question, which is still unanswered.
@Zoyd, did you find a way to configure it?
I've made a short video of how debugging works in Gogland and how it should be configured: https://youtu.be/tT0Op-DYs4s.
In place of the Println you can have your usual API handler and then just run the curl command against your API as usual, as long as you run it with the debugging configuration.

LinkedIn: image not showing up after sharing via REST API

I shared a post from an application but the image doesn't show up in the update. I realise this question has been asked a couple of times already, and I checked all the info I could find here, to no avail.
I checked content type, which is correct.
I checked the url, which works.
I checked the ssl certificate which seems to be fine.
The image in question is: https://soworker.com/files/images/608-3XZcFrnujD_LN.JPG
The share is being made using the REST API and the share itself gives a success message.
How can I debug why the image is not showing up? Am I missing something?
Thanks in advance for your response.
I am having this same issue. When I post this url via the API - https://blog.calevans.com/2016/05/16/postcards-life-010/ - it will not show the image.
However, when I post this one - http://voicesoftheelephpant.com/2016/05/17/interview-amanda-folson/ - it works.
Since they both sit on the same server and are both running the same software, my current theory is that LinkedIn can't read images from secure servers. Alternatively, though less likely, it may be that they won't read images from sites using Let's Encrypt certificates.
UPDATE: It seems to be Let's Encrypt. The podcast is also available encrypted, but not using a Let's Encrypt cert, because Apple won't read them. I posted a second update using the https:// version and it worked.
So it LOOKS like LinkedIn doesn't like Let's Encrypt.
HTH,
=C=
Apparently LinkedIn didn't like Let's Encrypt certificates. The problem is completely resolved now, though, and I don't think this will be a certificate issue anymore.

MATLAB - Using urlwrite with https Site Not Working in OS X [duplicate]

Anybody know if it's possible?
I'm trying to get the data by using the following code
url = 'https://cgwb.nci.nih.gov/cgi-bin/hgTracks';
params = {'org','Human','db','hg18','position','EGFR'};
urltxt = urlread(url,'get',params);
but get the error
??? Error using ==> urlread at 111
Error downloading URL. Your network connection may be down or your proxy settings improperly configured.
If I substitute http for https, it works, but I get a "301 Moved Permanently" page pointing to the https link above.
The link works properly in the browser in both cases (the http request is redirected). The site does not require any authentication.
Maybe there are other ways than urlread?
Sorry, I found the answer on SO:
Handling an invalid security certificate using MATLAB's urlread command
Will test and remove if needed.
UPDATE:
It really works. Do you think I should delete the question?
An alternative solution that worked for me:
(P.S.: I'm using Fedora Linux, Matlab 2017a. Not sure if it will work in a simple way for Windows users.)
The Matlab command I used to get the data:
AllDataURL=urlread('https://bittrex.com/api/v1.1/public/getmarketsummaries');
was successfully replaced by the following command line:
[status,AllDataURL]=dos('curl https://bittrex.com/api/v1.1/public/getmarketsummaries');
The result value for the variable 'status' is zero (indicating the command succeeded), and the data in the variable AllDataURL is exactly the same as before when using urlread.
Hope it helps.
