The URL package contains code to parse and handle URLs. It was originally part of w3, the web browser written entirely in EmacsLisp. Now it is a separate library available in Emacs 22.
It supports the following URI schemes: cid, data, dav (webdav, vc-dav), file, ftp, http, https, imap, info, irc, ldap, mailto, man, news, nfs, rlogin, snews, telnet, tn3270. You can add your own functions to deal with other schemes; for example, JabberEl defines ‘url-xmpp’.
With its file name handlers enabled (see ‘url-handler-mode’, or ‘url-setup-file-name-handlers’ if you’re using a version of URL older than the one in Emacs 22) you can use functions like ‘find-file’ and ‘file-exists-p’ on URLs.
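For example, once the handlers are enabled, ordinary file operations work on URLs (a quick sketch; the URL here is just an illustration):

(url-handler-mode 1)                        ; enable the file name handlers
(file-exists-p "http://www.emacswiki.org/") ; t if the server answers for it
(find-file "http://www.emacswiki.org/")     ; visit the page in a buffer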
There are also functions to deal with URL strings. ‘url-generic-parse-url’ splits a URL into a vector of all its parts. If you want to encode or decode URLs with hex codes (like %20 for a space) you can use ‘url-hexify-string’ and ‘url-unhex-string’, respectively. These functions can handle multibyte strings (although not as well in older versions).
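A quick sketch of these functions in action (expected results shown in comments):

(url-generic-parse-url "http://www.emacswiki.org/emacs/UrlPackage")
;; ⇒ a parsed-URL object; read its parts with accessors such as
;;   ‘url-type’, ‘url-host’ and ‘url-filename’
(url-hexify-string "just a test")     ; ⇒ "just%20a%20test"
(url-unhex-string "just%20a%20test")  ; ⇒ "just a test"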
You can use URL to download web pages - it works in the background, dealing with all the HTTP status codes, authorization, cookies, etc. Useful functions are ‘url-retrieve-synchronously’, which returns the buffer containing the server’s response, and ‘url-retrieve’, which lets you run a callback function after the page has been retrieved. See also ‘browse-url-emacs’ if you just want to load a URL into a buffer (it opens in another window).
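Roughly, the two retrieval styles look like this (a sketch; the response buffer holds the raw headers followed by the body):

;; Synchronous: block until the response arrives, then return its buffer.
(with-current-buffer
    (url-retrieve-synchronously "http://www.emacswiki.org/")
  (buffer-string))

;; Asynchronous: the callback runs in the response buffer when it's done.
(url-retrieve "http://www.emacswiki.org/"
              (lambda (status)
                (message "Response is %d bytes" (buffer-size))))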
Suppose you want to send a response to a form, i.e. make a POST request. Define the following functions:
(defun my-url-http-post (url args)
  "Send ARGS to URL as a POST request."
  (let ((url-request-method "POST")
        (url-request-extra-headers
         '(("Content-Type" . "application/x-www-form-urlencoded")))
        (url-request-data
         (mapconcat (lambda (arg)
                      (concat (url-hexify-string (car arg))
                              "="
                              (url-hexify-string (cdr arg))))
                    args "&")))
    ;; if you want, replace `my-switch-to-url-buffer' with `my-kill-url-buffer'
    (url-retrieve url 'my-switch-to-url-buffer)))

(defun my-kill-url-buffer (status)
  "Kill the buffer returned by `url-retrieve'."
  (kill-buffer (current-buffer)))

(defun my-switch-to-url-buffer (status)
  "Switch to the buffer returned by `url-retrieve'.
The buffer contains the raw HTTP response sent by the server."
  (switch-to-buffer (current-buffer)))
And then you can do the following:
(my-url-http-post "http://localhost/test.cgi" '(("post" . "1") ("text" . "just a test")))
See also Lisp:http-post-simple.el.
NB In the my-url-http-post function above, I had better luck using w3m-url-encode-string rather than url-hexify-string. - Geoff
GET is similar, but does not use ‘url-request-data’; instead, the query string is concatenated onto the URL you’re GETting.
(defvar fb-url "https://www.googleapis.com/freebase/v1/search")

(defun fbquery (type str)
  (let ((url-request-method "GET")
        (arg-stuff (concat "?query=" (url-hexify-string str)
                           "&filter=" (url-hexify-string type))))
    (url-retrieve (concat fb-url arg-stuff)
                  (lambda (status)
                    (switch-to-buffer (current-buffer))))))
When you retrieve pages from websites, url-cookie’s default behavior is to set all the cookies in the HTTP Set-Cookie headers it receives (as long as the domain value matches the host, has enough dots, and so on). You can customize the variables ‘url-cookie-confirmation’, ‘url-cookie-trusted-urls’, and ‘url-cookie-untrusted-urls’ to control whether or not to allow cookies. Cookies from trusted URLs are always allowed, and cookies from untrusted URLs are never set. If a URL is either trusted or untrusted, you won’t be asked for confirmation.
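For instance, to be asked before any cookie from a URL that is neither trusted nor untrusted gets stored:

(setq url-cookie-confirmation t)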
url-cookie checks each RegularExpression listed in ‘url-cookie-trusted-urls’ and ‘url-cookie-untrusted-urls’ to see if it matches the current URL. If both a trusted and an untrusted regexp match the requested URL, the more complete match (determined by comparing (- (match-end 0) (match-beginning 0))) wins. Note that if a trusted and an untrusted regular expression produce the same match length, the cookie will be rejected.
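To illustrate the rule with a hypothetical pair of patterns:

;; For the requested URL "http://www.emacswiki.org/emacs/":
;;   trusted:   "^http://www\\.emacswiki\\.org/"  matches 25 characters
;;   untrusted: "^http://"                         matches  7 characters
;; The trusted match is longer, so the cookie is accepted.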
To reject all cookies (see also ‘url-privacy-level’):
(setq url-cookie-untrusted-urls '(".*"))
Say you want to ignore all cookies except those set by the EmacsWiki site. Using ".*" for ‘url-cookie-untrusted-urls’ in this case will always result in the cookie being rejected, because it matches the whole string. Try the following regular expressions instead:
(setq url-cookie-trusted-urls '("^http://\\(www\\.\\)?emacswiki\\.org/.*")
      url-cookie-untrusted-urls '("^https?://"))
Each cookie is a defstruct, and all cookies are stored in a list defined by ‘url-cookie-storage’ and persistently in the file ‘url-cookie-file’. Cookies are automatically saved to this file once every ‘url-cookie-save-interval’ seconds.
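You can inspect the in-memory store directly, or flush it to disk without waiting for the timer (a sketch using definitions from url-cookie.el):

url-cookie-storage       ; the current in-memory list of cookies
(url-cookie-write-file)  ; save cookies to ‘url-cookie-file’ right away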
Out of the box, url supports HTTP basic and digest authentication. Other schemes can be added using ‘url-register-auth-scheme’. For example, if you’re after NTLM support, try url-http-ntlm.
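Registration looks roughly like this (a sketch; ‘my-ntlm-auth’ is a hypothetical handler function you would have to write yourself):

;; TYPE names the scheme, FUNCTION builds the authorization data, and
;; RATING lets url pick the strongest scheme a server offers.
(url-register-auth-scheme "ntlm" 'my-ntlm-auth 8)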
Proxies are supported by UrlPackage, by setting ‘url-proxy-services’ to an alist mapping URL schemes to the proxy servers that support them.
(defcustom url-proxy-services nil
  "An alist of schemes and proxy servers that gateway them.
Looks like ((\"http\" . \"hostname:portnumber\") ...).
This is set up from the ACCESS_proxy environment variables."
  :type '(repeat (cons :format "%v"
                       (string :tag "Protocol")
                       (string :tag "Proxy")))
  :group 'url)
Example (borrowed from TabithaTipper):
(setq url-proxy-services
      '(("http" . "http://proxy.example.com:8080")
        ("ftp" . "proxy.example.com:8080")
        ("no_proxy" . "^.*example.com")))
You can set ‘url-debug’ to a variety of values that log information to the buffer ‘*URL-DEBUG*’:
(defcustom url-debug nil
  "What types of debug messages from the URL library to show.
Debug messages are logged to the *URL-DEBUG* buffer.
If t, all messages will be logged.
If a number, all messages will be logged, as well shown via `message'.
If a list, it is a list of the types of messages to be logged."
  :type '(choice (const :tag "none" nil)
                 (const :tag "all" t)
                 (checklist :tag "custom"
                            (const :tag "HTTP" :value http)
                            (const :tag "DAV" :value dav)
                            (const :tag "General" :value retrieval)
                            (const :tag "Filename handlers" :value handlers)
                            (symbol :tag "Other")))
  :group 'url-hairy)
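For example, to log everything, or only HTTP traffic:

(setq url-debug t)        ; log all message types to *URL-DEBUG*
(setq url-debug '(http))  ; or log only HTTP-related messages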
There are a variety of HTTP debugging proxies out there, depending on your platform. Install one, and set ‘url-proxy-services’ to point to it. The proxy will then capture the data streaming between Emacs and your server.
On Windows, a good HTTP debugging proxy is Fiddler, which listens on localhost:8888 by default, so set ‘url-proxy-services’ like this:
(setq url-proxy-services '(("http" . "localhost:8888")))
If you just want to download HTTP documents but you don’t have url.el yet, try a very simple hack on DownloadFilesViaHttp.
See also: HttpPost, HttpGet and family
CategoryHypermedia CategoryWebBrowser CategoryCode CategoryExtensions