I am a student at a university that maintains an internal mail server (Zimbra). All the important mail is delivered to this account, but the fact is that no one bothers to check it. I am writing a bash script that notifies the user when new mail arrives in the inbox. I have figured out a way to find the number of unread mails from the page returned after the login (POST method): the title contains the unread mail count, so it is just a few greps and seds away.
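For reference, the extraction I have in mind is roughly this (just a sketch; the "Inbox (N)" title format is an assumption about what the returned page actually contains):
# sketch only: pull the unread count out of the returned page's <title>;
# assumes the title looks something like "Zimbra: Inbox (3)"
unread=$(grep -o '<title>[^<]*</title>' 2.html | sed -n 's/.*(\([0-9][0-9]*\)).*/\1/p')
echo "Unread mails: ${unread:-0}"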
I guess the Zimbra suite has an API, but I have just started to learn Python, so for now I have no choice but to rely on curl. The form is as follows (I've filtered out the relevant input tags):
<input type="hidden" name="loginOp" value="login"/>
<input id="username" class="zLoginField" name="username" type="text" value="" />
<input id="password" class="zLoginField" name="password" type="password" value=""/>
<td><input id="remember" value="1" type="checkbox" name="zrememberme" /></td>
<td><input type="submit" class="zLoginButton" value="Log In"/></td>
We have to use a proxy server to connect to the internet, and the HTTPS certificate isn't trusted, hence the -k.
I am not sure about that last input, but I managed to put together this:
curl -A 'Mozilla/5.0 (X11; Linux i686; rv:10.0) Gecko/20100101 Firefox/10.0' -c cookies.txt -k -x 10.1.1.26:8080 -d "loginOp=login&username=xxxxx&password=xxxxx&zrememberme=1&Log+In" https://warrior.bits-goa.ac.in/zimbra/?zinitmode=http > 2.html
The "Log In" button doesn't have a name attribute. That's weird! (OK, I admit I've never come across a form like this before.)
Or is it the cookie? The curl manual recommends the Netscape cookie file format.
The output file is the same as the login page, but with an added "Your browser doesn't support cookies" line. What do I do?
Also, there is a wonderful add-on called Tamper Data for Firefox. I am not able to install it because, for some reason, the network admin has blocked the download link (and I don't trust proxy sites ;) ). Could someone capture and post the POST data from a random login attempt on this site:
http://warrior.bits-goa.ac.in
Thanks! And sorry for the elaborate question. I want to learn this once and for all :)
curl is definitely (one of) the right tools to use: you just need a --cookie (and maybe a glass of milk).
Save the session cookie with -c (the cookie jar) on the login request, then send it back with -b on subsequent requests. Here is an example where I used it for Dropbox:
http://murga-linux.com/puppy/viewtopic.php?p=597711#597711
and where I learned the curl commands for it:
http://curl.haxx.se/docs/httpscripting.html
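For your case the flow would look roughly like this (a sketch reusing the URL and proxy from your command; the second URL is an assumption, point it at whichever page carries the unread count in its title):
# 1) log in and let curl SAVE the session cookies to a jar (-c)
curl -k -x 10.1.1.26:8080 -c cookies.txt \
     -d "loginOp=login&username=xxxxx&password=xxxxx&zrememberme=1" \
     "https://warrior.bits-goa.ac.in/zimbra/?zinitmode=http" -o login.html
# 2) fetch the mail page again, SENDING the saved cookies back (-b)
curl -k -x 10.1.1.26:8080 -b cookies.txt \
     "https://warrior.bits-goa.ac.in/zimbra/" -o 2.html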
I am upgrading Sage Pay (Opayo) Direct from protocol 2.23 to 4.00. After the upgrade, the 3-D Secure page does not open in live mode.
I am using Sage Pay Direct. After submitting the card details I get Status=3DAUTH, and I then redirect to another page where the 3-D Secure page opens in an iframe.
Testing in test mode:
After completing the implementation I tested in test mode. The challenge page came up, authentication succeeded, and the payment completed.
Testing in live mode:
When I test in live mode with a Soldo virtual card, the 3-D Secure page opens successfully; after I approve the payment in the Soldo app, the payment succeeds.
However, when I try an Amex card, the 3-D Secure loading screen is shown and then the error below appears:
Oops ! An error occurred!!!
Internal processing Error..!!!
I also tried a payment with a Mastercard. In that case, when I redirect to the 3-D Secure page, nothing shows in the iframe and it just hangs.
I am sending the parameters below in the first call:
$strPost = $strPost . "&Apply3DSecure=0";
$strPost = $strPost . "&AccountType=E";
$strPost = $strPost . "&BrowserAcceptHeader=text/html,application/xhtml+xml,application/xml&BrowserColorDepth=24&BrowserJavaEnabled=1&BrowserJavascriptEnabled=1&BrowserLanguage=en-GB&BrowserScreenHeight=1080&BrowserScreenWidth=1920&BrowserTZ=%2B300&BrowserUserAgent=Mozilla&ChallengeWindowSize=01";
$strPost = $strPost . "&ThreeDSNotificationURL=".$strYourSiteFQDN."3DCalBack.php?pagename=transactionRegistration.php&VendorTxCode=".$strVendorTxCode;
$strPost = $strPost ."&COFUsage=FIRST&InitiatedType=CIT&MITType=UNSCHEDULED";
ACSURL submit form
<form name="form" action="{$ACSURL}?creq={$strCReq}" method="POST">
<input type="hidden" name="PaReq" value="{$strPAReq}"/>
<input type="hidden" name="creq" value="{$strCReq}"/>
<input type="hidden" name="TermUrl" value="{$TermUrl}?VendorTxCode={$strVendorTxCode}"/>
<input type="hidden" name="MD" value="{$strMD}"/>
<input type="hidden" name="VPSTxId" value="{$strVPSTxId}"/>
<input type="hidden" name="mode" value="secure3d"/>
</form>
Please help me get the 3-D Secure page working.
First, make sure this is not a Frictionless Authentication. From the docs:
For a frictionless authentication, the 3D Secure scheme has enough
information about the cardholder to provide an instant authentication
result. When the authentication process has completed the transaction
is submitted for authorisation.
I.e. there is no need to ask for a code or to redirect to the 3-D Secure page (or show it in an iframe). If necessary, you may pass the Apply3DSecure=1 parameter in your initial request to Sage Pay, so that the 3-D Secure check is always forced.
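For example (a sketch only; REGISTER_URL is a placeholder for whatever transaction-registration endpoint you already send $strPost to, and only the relevant fields are shown):
# force the 3-D Secure challenge on the initial transaction registration
REGISTER_URL="https://example.invalid/vspdirect-register"   # placeholder, not the real endpoint
curl -s -d "VPSProtocol=4.00&TxType=PAYMENT&Vendor=yourvendorname&Apply3DSecure=1" \
     "$REGISTER_URL" -o response.txt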
Second, for the 3-D Secure v2 redirect, the form fields should be:
<form action="{$ACSURL}" method="post">
<input type="text" name="ACSTransID" value="{$strACSTransID}" />
<input type="text" name="creq" value="{$strCReq}" />
<input type="text" name="TermUrl" value="{$TermUrl}?VendorTxCode={$strVendorTxCode}" />
</form>
All data should be POSTed, and you should include the ACSTransID value from the response to your initial request. Note: the PaReq and MD fields belong to 3-D Secure v1. If for some reason you would like to keep backwards compatibility, your current integration should be fine; in that case you can check the response to the initial request (a quick check is sketched after this list):
if it contains CReq and ACSTransID fields, Sage Pay expects the communication via the v2 protocol;
if it contains PaReq and MD fields, it's the v1 protocol.
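A rough way to check (a sketch; it assumes the registration response was saved to response.txt as the usual CRLF-separated name=value list):
# decide which 3-D Secure version Opayo expects from the fields in the response
if grep -q '^CReq=' response.txt && grep -q '^ACSTransID=' response.txt; then
    echo "v2: POST creq (and ACSTransID) to the ACSURL"
elif grep -q '^PaReq=' response.txt && grep -q '^MD=' response.txt; then
    echo "v1: POST PaReq, MD and TermUrl to the ACSURL"
fi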
However, 3-D Secure v1 is going to be withdrawn soon, on October 15, 2022, so keeping backwards compatibility is apparently not necessary.
I deployed my Laravel application on shared hosting and it works fine, but every time I open the website for the first time in a new browser, the first POST request I make returns error 419, Page Expired. After that the system works fine and every POST succeeds, but I don't know why this error always happens the first time the site is opened in a new browser. The csrf_token is correctly present in the code.
<form method="POST">
@csrf
<input class="fields" name="user" placeholder="Email"/><br/>
<input class="fields" name="password" type="password" placeholder="Senha"/><br/>
<input class="entrarbtn" value="Entrar" type="submit"/><br/>
</form>
I just added these settings to php.ini and it works now:
; cPanel-generated php ini directives, do not edit
; Manual editing of this file may result in unexpected behavior.
; To make changes to this file, use the cPanel MultiPHP INI Editor (Home >> Software >> MultiPHP INI Editor)
; For more information, read our documentation (https://go.cpanel.net/EA4ModifyINI)
output_buffering = 16384
I'm trying to get form-based authentication to work and came across only one reference on the web, which indicated that the URL is j_security_check and the parameters are j_username and j_password. There is no mention of this anywhere in the Quarkus docs.
Am I missing something or is the documentation lacking this critical piece of information?
Thanks
Jeff
Yep, you're totally right about the lack of documentation on that. Despite this, I've consolidated some information.
To make the authentication work:
Create an HTML form which performs a POST to /j_security_check.
login.html
<form action="j_security_check" method="post">
<div class="container">
<label for="j_username"><b>Username</b></label>
<input type="text" placeholder="Enter Username" name="j_username" required>
<label for="j_password"><b>Password</b></label>
<input type="password" placeholder="Enter Password" name="j_password" required>
<button type="submit">Login</button>
</div>
</form>
A CURL example
$ curl -i -X POST http://localhost:8080/j_security_check -d 'j_username=admin&j_password=admin'
HTTP/1.1 302 Found
location: http://localhost:8080/index.html
content-length: 0
set-cookie: quarkus-credential=DPeFtSios6kIpWpJw6BpCfId3+MT151H3yPOc5VzfYdrdO6oRcE+dy18IL0KMeFx; Path=/
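From there, the returned quarkus-credential cookie is what authenticates follow-up requests, e.g. (a sketch reusing the cookie value from the response above):
# reuse the session cookie on a protected resource
curl -i http://localhost:8080/index.html \
     --cookie 'quarkus-credential=DPeFtSios6kIpWpJw6BpCfId3+MT151H3yPOc5VzfYdrdO6oRcE+dy18IL0KMeFx'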
References about the login mechanism
I didn't find any additional docs explaining the login steps, but I did some greps on the Quarkus source code and found the file below, which seems to handle the /j_security_check endpoint. It looks like there is no way to customize this path, or the parameter names, yet. This file is present in the dependency quarkus-vertx-http-1.10.5.Final.jar:
public class FormAuthenticationMechanism implements HttpAuthenticationMechanism {
private static final Logger log = Logger.getLogger(FormAuthenticationMechanism.class);
public static final String DEFAULT_POST_LOCATION = "/j_security_check";
An interesting fact: a file with the same name, but with more advanced code, can also be found in the Undertow project; the path can be customized there.
I have some problems with PayPal's IPN.
I have a Windows server with IIS hosting two domains; one is in the root folder and the other is nested deeper:
www.website.com --> root/index.aspx
www.secondweb.com --> root/website/completed/4/index.aspx
The PayPal IPN appears to be verified for both websites: it works in https://developer.paypal.com/developer/ipnSimulator/ and it also shows up in https://www.paypal.com/cgi-bin/webscr?cmd=_display-ipns-history&nav=0.3.2.
In my code at the moment there is no check on the response, because I stripped out almost all of my code, but I can assure you it is verified. I still don't understand why the website in the root works (it writes to the DB and the log) while the other one doesn't. This is my code:
<%@ Page Language="VB" AspCompat="true" %>
<%@ Import Namespace="MySql.Data.MySqlClient" %>
<%@ Import Namespace="System.IO" %>
<%@ Import Namespace="System.Web" %>
<%
Dim IdWeb = System.Guid.NewGuid().ToString()
Dim IdUse
Dim Pac
Dim AllPar
Dim IdDis
Dim MonGra
Dim Price
'Listener Payment
Dim objHttp, str
' read post from PayPal system and add 'cmd'
str = Request.Form.ToString() & "&cmd=_notify-validate"
' post back to PayPal system to validate
objHttp = Server.CreateObject("Msxml2.ServerXMLHTTP")
' set objHttp = Server.CreateObject("Msxml2.ServerXMLHTTP.4.0")
' set objHttp = Server.CreateObject("Microsoft.XMLHTTP")
objHttp.open("POST", "https://ipnpb.paypal.com/cgi-bin/webscr", false)
objHttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded")
objHttp.Send(str)
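' NOTE: objHttp.responseText ("VERIFIED" or "INVALID") is deliberately not checked here (the code is stripped down for debugging)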
' Create and open the connection
Dim Connection As New MySqlConnection("server=localhost;Uid=root;Pwd=***;Database=mb")
Dim Command As New MySqlCommand
Command.Connection = Connection
Connection.Open()
' Execute the SQL statement
Command.CommandText = "INSERT INTO WEBSITE (IDWEB) VALUES ('bbb')"
Command.ExecuteNonQuery()
' Close the connection
Connection.Close()
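' Append a timestamped line to log.txt just to confirm this handler actually ran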
using fs = new FileStream(HttpContext.Current.Request.MapPath("log.txt"),FileMode.Append, FileAccess.Write)
using swr = new StreamWriter(fs)
swr.Write(DateTime.Now.ToString("yyyy/MM/dd HH:mm:ss")+" ")
swr.Write("hello")
swr.Write(Environment.NewLine)
end using
end using
%>
So I tried to focus on the SSL certificate configuration. I think you are right, the problem has to be there, because the rest works and is the same for both websites.
This is my configuration:
Binding 1: https, All Unassigned, port 443
Host name: www.website.com
Require Server Name Indication: checked
SSL certificate: website.com
Binding 2: https, All Unassigned, port 443
Host name: www.secondweb.com
Require Server Name Indication: checked
SSL certificate: secondweb.com
I get this warning:
No default SSL site has been created. To support browsers without SNI
capabilities, it is recommended to create a default SSL site.
My issue is still the same. Could this warning be the reason it still doesn't work?
I have read this:
One thing to note with implementing SNI for your SSL solution, it will
not work for those users running Internet Explorer on Windows XP
Both websites make the same request:
<form id="payment" action="https://www.paypal.com/cgi-bin/webscr" method="post">
<input type="hidden" name="cmd" value="_xclick-subscriptions">
<input type="hidden" name="business" value="***">
<input type="hidden" name="item_name" value="MB Website's Production">
<input type="hidden" name="no_shipping" value="1">
<input type="hidden" name="a1" value="0">
<input type="hidden" name="p1" value="1">
<input type="hidden" name="t1" value="M">
<input type="hidden" name="a3" value="0.01">
<input type="hidden" name="p3" value="1">
<input type="hidden" name="t3" value="M">
<input type="hidden" name="src" value="1">
<input type="hidden" name="currency_code" value="EUR">
<input type="hidden" name="custom" value="Mirco---completed------1---0.01">
<input type="submit" value="Envía">
</form>
The only difference between them is where each website sits on the server.
Maybe PayPal IPN has some problem with SNI?
The PayPal IPN appears to be verified for both websites: it works in
https://developer.paypal.com/developer/ipnSimulator/ and it also shows up in
https://www.paypal.com/cgi-bin/webscr?cmd=_display-ipns-history&nav=0.3.2.
All this means is that the IPN message was sent, and the endpoint responded with an HTTP 200 OK.
What happens next is what's failing. Your code then tries to connect to https://ipnpb.paypal.com/cgi-bin/webscr with cmd=_notify-validate, which is the first thing that may be failing. You need to log all possible errors within your code and debug this. That HTTPS connection may be failing, either because of the TLS 1.2 requirement or because your server does not trust the certificate authority behind the SSL certificate of that PayPal endpoint.
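If you can run curl (or something similar) on that box, a quick sanity check of the outbound leg looks like this (a sketch; it only proves the host can reach the validation endpoint over TLS 1.2, not that your IPN logic is right):
# verbose output shows certificate or protocol problems during the handshake
curl -v --tlsv1.2 -d "cmd=_notify-validate" https://ipnpb.paypal.com/cgi-bin/webscr -o /dev/null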
And assuming all of that works and you receive a 'VERIFIED' response from the IPN postback endpoint, other problems may be happening later on in your code.
You need to do your own runtime debugging from your environment; we can't debug this code running on your system for you.
Finally it works!!!
I had to put my IPN page in the root:
root/paypal_ipn.aspx
Before, I had one IPN page there for the main website and another one for the second website; now I manage everything from a single page in the root, and everything works!
Thank you
I have a really simple web form:
<form name="logmeinsupport" action="https://secure.logmeinrescue.com/Customer/Code.aspx" method="post">
<span>Enter your 6-digit PIN code: </span><input type="text" name="Code" /><br />
<input type="submit" value="Connect to technician" />
</form>
On the LogMeIn Rescue side, the error code is posted back in the URL, like this:
http://tomtom-uk--tst2.tomtom.com/app/utils/support_login?LogMeInRescueResponse=PINCODE_INVALID
However, our server setup interprets it as a page on the web server, so I get a 404 error:
404 Page Not Found
The page 'utils/support_login?LogMeInRescueResponse=PINCODE_INVALID.php' was not found.
It is not easy for our infrastructure team to change this. I am wondering whether I can use an Ajax POST callback to get the response, though I think the difficulty there is making a cross-domain Ajax call.
Any examples? Thanks
Cheers,
Qing
The server looks for the file 'utils/support_login?LogMeInRescueResponse=PINCODE_INVALID.php'.
I think you should change your request to:
utils/support_login.php?LogMeInRescueResponse=PINCODE_INVALID
That way the server will look for support_login.php, and if a file with that name actually exists on your server you will not get this error again.