I am using Friend for authentication/authorization in a Clojure Ring application. I am trying to persist session data in a cookie via the 'Remember Me' functionality, so that it survives e.g. server restarts. My handler definition is:
(def secured-routes
  (-> app-routes
      (friend/authenticate friend-param-map)
      (wrap-defaults (-> site-defaults
                         (assoc-in [:security :anti-forgery] false)
                         (assoc :session {:store (cookie-store {:key "16-byte-secret"})
                                          :cookie-name "TestCookie"
                                          :cookie-attrs {:max-age 1800}})
                         (assoc :cookies true)))
      wrap-json-params))
What else do I need to write to make it work? Do I need to create a cookie first in one of the app-routes handlers?
Thanks!
I guess you are already aware of this issue. Anyway, as far as I recall, your best chance is to use wrap-session.
Something like this should work (sorry, not tested):
(def secured-routes
  (-> app-routes
      (friend/authenticate friend-param-map)
      (wrap-session {:store (cookie-store {:key "16-byte-secret"}) ; key must be exactly 16 bytes
                     :cookie-attrs {:max-age 3600}
                     :cookie-name "TestCookie"})
      (wrap-defaults (-> site-defaults
                         (assoc-in [:security :anti-forgery] false)))
      wrap-json-params))
It's a continuation of my previous question How to produce a lazy sequence by portion in clojure?
I want to download data from a database in portions. Initially I download the first 500 rows, then I send a request to fetch the next 500 rows, and so on until I receive all the data from the server.
I wrote the code:
(jdbc/atomic conn
  (with-open [cursor (jdbc/fetch-lazy conn [sql_query])]
    (let [lazyseq (jdbc/cursor->lazyseq cursor)
          counter (atom 1)]
      (swap! lazyseq_maps assoc :session_id {:get_next? (chan 1) :over_500 (chan 1) :data []})
      (>!! (:get_next? (:session_id @lazyseq_maps)) true)
      (go
        (doseq [row lazyseq]
          (swap! counter inc)
          (when (<! (:get_next? (:session_id @lazyseq_maps)))
            (swap! lazyseq_maps update-in [:session_id :data] conj row)
            (if (not= 0 (mod @counter 500))
              (>! (:get_next? (:session_id @lazyseq_maps)) true)
              (>! (:over_500 (:session_id @lazyseq_maps)) true))))
        (close! (:get_next? (:session_id @lazyseq_maps)))
        (close! (:over_500 (:session_id @lazyseq_maps)))
        (.close conn))
      (when (<!! (:over_500 (:session_id @lazyseq_maps)))
        {:message "over 500 rows"
         :id :session_id
         :data (:data (:session_id @lazyseq_maps))}))))
I fetch rows with the help of the doseq loop. When doseq has passed 500 rows, I park the loop ((when (<! (:get_next? (:session_id @lazyseq_maps))))) and wait for a signal from outside to retrieve the next 500 rows.
But here I have a problem. When I send the signal, the program throws the error "Resultset is closed", i.e. the connection is closed outside the with-open scope. But I don't understand why, because the go block is placed inside the with-open scope. Can you help me solve the problem?
(go ...) returns immediately, and therefore so does (with-open ...).
You may want to do it the other way around:
(go (with-open ...))
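A minimal sketch of that inversion, reusing the vars from the question (process-row is a hypothetical placeholder for whatever you do with each row):

```clojure
;; The go block now owns the cursor: with-open only closes it after
;; the doseq has consumed every row, so the result set stays valid.
(go
  (with-open [cursor (jdbc/fetch-lazy conn [sql_query])]
    (doseq [row (jdbc/cursor->lazyseq cursor)]
      (process-row row))))
```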
However, do note that this process will hold on to a database connection (a scarce resource!) for a potentially very long time, which may not be desirable, and kind of goes against the benefit of having 'lightweight' threads thanks to go blocks. Here are some alternatives to consider:
Maybe you could re-open a database connection for each batch?
Maybe you could eagerly stream the whole result set to an external store (e.g. AWS S3) and have the client poll against that?
Unless you are on a seriously memory-constrained system, I would recommend just loading all rows into RAM at once and closing the DB connection. Otherwise your complete solution will likely be very complex and difficult to test and reason about.
If you have tens of millions of rows, maybe you could fetch them in partitions?
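For the "partitions" idea, a rough sketch assuming a clojure.java.jdbc-style jdbc/query; the table and column names are made up:

```clojure
(defn fetch-batch [db limit offset]
  ;; Each call runs a bounded query, so no connection or cursor is
  ;; held open between batches.
  (jdbc/query db ["SELECT * FROM events ORDER BY id LIMIT ? OFFSET ?"
                  limit offset]))

(defn fetch-all [db limit]
  ;; Lazily walk offsets 0, limit, 2*limit, ... until a batch comes
  ;; back empty.
  (->> (iterate #(+ % limit) 0)
       (map #(fetch-batch db limit %))
       (take-while seq)))
```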
I use url.el to make an HTTP request via url-retrieve-synchronously, and when the URL is correct all is fine.
Code example:
(with-current-buffer (url-retrieve-synchronously my-url)
  (hoge--log-debug "\n%s" (buffer-string)))
But:
How can I handle the HTTP response when the URL is incorrect,
e.g. "http://2222httpbin.org/xml" (unknown host)?
How can I get the HTTP response status?
Apparently url-retrieve-synchronously only returns a valid buffer or nil. I don't think you can retrieve the status. Your best option is to call url-retrieve, which allows you to pass a callback function where you can access all the details.
Workaround
Looking at the processes given by list-processes, it appeared that the background process was still hanging there. Deleting it would call the callback function. So we only need to delete it ourselves when we know the process failed:
(require 'cl)  ; for `block', `return-from' and `loop'

(defun url-retrieve-please (url callback)
  (apply callback
         (block waiter
           (let ((process
                  (get-buffer-process
                   (url-retrieve url (lambda (&rest args)
                                       (return-from waiter args))))))
             ;; We leave a chance for the connection to establish
             ;; properly.  When it succeeds, the callback will return
             ;; from the waiter block.
             ;; When it fails to connect, we exit this loop.
             (loop until (eq 'failed (process-status process))
                   ;; Sitting leaves a chance for Emacs to handle
                   ;; asynchronous tasks.
                   do (sit-for 0.1))
             ;; Deleting the process forces the above lambda callback
             ;; to be called, thanks to the process sentinel being in
             ;; place.  In the tests, we always exit from the above
             ;; callback and not after the block normally exits.  The
             ;; behaviour seems quite regular, so I don't sleep
             ;; forever after this command.
             (delete-process process)))))
Tests
(url-retrieve-please "http://yahoo.com" (lambda (&rest args) (message "%S" args)))
"((:redirect \"https://www.yahoo.com/\" :peer (:certificate (:version 3 :serial-number \"1c:25:43:0e:d0:a6:02:e8:cc:3a:97:7b:05:39:cc:e5\" :issuer \"C=US,O=Symantec Corporation,OU=Symantec Trust Network,CN=Symantec Class 3 Secure Server CA - G4\" :valid-from \"2015-10-31\" :valid-to \"2017-10-30\" :subject \"C=US,ST=California,L=Sunnyvale,O=Yahoo Inc.,OU=Information Technology,CN=www.yahoo.com\" :public-key-algorithm \"RSA\" :certificate-security-level \"Medium\" :signature-algorithm \"RSA-SHA256\" :public-key-id \"sha1:47:16:26:79:c6:4f:b2:0f:4b:89:ea:28:dc:0c:41:6e:80:7d:59:a9\" :certificate-id \"sha1:41:30:72:f8:03:ce:96:12:10:e9:a4:5d:10:da:14:b0:d2:d4:85:32\") :key-exchange \"ECDHE-RSA\" :protocol \"TLS1.2\" :cipher \"AES-128-GCM\" :mac \"AEAD\")))"
(url-retrieve-please "http://2222httpbin.org/xml" (lambda (&rest args) (message "%S" args)))
"((:error (error connection-failed \"deleted
\" :host \"2222httpbin.org\" :service 80)))"
To retrieve the status code, use url-http-symbol-value-in-buffer on the obtained buffer.
Example:
(url-http-symbol-value-in-buffer 'url-http-response-status
                                 (url-retrieve-synchronously "http://httpbin.org/get"))
I have tried for a few days and I am a little confused here.
I am using the Clojure http-kit client to make a keep-alive GET request.
(ns weibo-collector.weibo
  (:require [org.httpkit.client :as http]
            [clojure.java.io :as io]))

(def sub-url "http://c.api.weibo.com/datapush/status?subid=10542")

(defn spit-to-file [content]
  (spit "sample.data" content :append true))

@(http/get sub-url {:as :stream :keepalive 3000000}
           (fn [{:keys [status headers body error opts]}]
             (spit-to-file body)))
I am pretty sure that I made a persistent connection to the target server, but nothing is written to the sample.data file.
I tried :as :stream and :as :text.
I also tried a Ruby version; that program creates a persistent connection as well, and still nothing is written.
Typically the target will use a webhook to notify my server that new data is coming, but how do I get data from the persistent connection?
---EDIT---
require 'awesome_print'
require 'httpclient'

url = "http://c.api.weibo.com/datapush/status?subid=10542"

c = HTTPClient.new
conn = c.get_async(url)

Thread.new do
  res = conn.pop
  while true
    text = ""
    while ch = res.content.read(1)
      text = text + ch
      break if text.end_with? "\r\n"
    end
    ap text
  end
end

while true
end
Above is a working example in Ruby; it uses a thread to read data from the connection. So I must be missing something to get the data in Clojure.
I have a classifieds Clojure web app that I want to host on Heroku. The domain is registered at GoDaddy.
What would be the most effective and efficient way to have multiple subdomains:
newyork.classapp.com
montreal.classapp.com
paris.classapp.com
...
Users and all logic should be shared across subdomains, so I'd love to have only one code base.
It would be easy to redirect subdomains to a first-level folder like this:
paris.classapp.com -> classapp.com/paris/
But I want users to keep seeing the subdomain while browsing the site, like this:
paris.classapp.com/cars/blue-car-to-sell
As opposed to this: classapp.com/paris/cars/blue-car-to-sell
What should I do?
Heroku supports wildcard subdomains: https://devcenter.heroku.com/articles/custom-domains#wildcard-domains.
You will have the original domain in the Host header, which you can use with something like (completely untested):
(GET "/" {{host "host"} :headers} (str "Welcomed to " host))
You could also create your own routing MW (completely untested):
(defn domain-routing [domain-routes-map]
  (fn [req]
    (when-let [route (get domain-routes-map (get-in req [:headers "host"]))]
      (route req))))
And use it with something like:
(defroutes paris
  (GET "/" [] "I am in Paris"))

(defroutes new-new-york
  (GET "/" [] "I am in New New York"))

(def my-domain-specific-routes
  (domain-routing {"paris.example.com"      paris
                   "newnewyork.example.com" new-new-york}))
And yet another option is to create a "mod-rewrite" MW that modifies the :uri before it reaches the Compojure routes:
(defn subdomain-mw [handler]
  (fn [req]
    ;; subdomain-from-host is assumed to extract e.g. "paris"
    ;; from "paris.classapp.com"
    (let [new-path (str "/"
                        (subdomain-from-host (get-in req [:headers "host"]))
                        (:uri req))]
      (handler (assoc req :uri new-path)))))

(defroutes my-routes
  (GET "/:subdomain/" [subdomain] (str "Welcomed to " subdomain)))
Pick the one that suits your requirements.
I want different handlers to set different keys in the session without affecting each other. I'm working from this wiki article, which advises using assoc. I thought I could use assoc-in to update a path in the session.
(defn handler-one
  [request]
  (prn "Session before one" (:session request))
  (-> (response "ONE")
      (content-type "text/plain")
      (assoc-in [:session :key-one] "one")))
(defn handler-two
  [request]
  (prn "Session before two" (:session request))
  (-> (response "TWO")
      (content-type "text/plain")
      (assoc-in [:session :key-two] "two")))
If I call handler-one repeatedly it prints Session before one {:key-one "one"} and likewise handler-two prints the previous session values.
By setting a session key using assoc-in I would expect both keys to be set, i.e. {:key-one "one" :key-two "two"}. But it appears that the entire session dictionary is replaced.
Am I doing this wrong?
You're printing the session from the request, but you're assoc'ing onto the (nonexistent) session in the response, so you end up with a session containing only the last added key. You should get the session out of the request, assoc into that, and then return the new session as part of the response.
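A minimal sketch of that fix, reusing one of the handlers from the question: start from the incoming session instead of building the response session from scratch, so keys set by the other handler survive.

```clojure
(defn handler-one
  [request]
  (-> (response "ONE")
      (content-type "text/plain")
      ;; Assoc into the session that came in with the request, so
      ;; :key-two (set earlier by handler-two) is preserved.
      (assoc :session (assoc (:session request) :key-one "one"))))
```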