226 Commits

Author SHA1 Message Date
9b59c4c89b Merge upstream changes 2022-02-06 00:05:12 +05:30
Konstantin Rybakov
00d84614c2 Merge pull request #389 from Lockszmith/Fix-Dockerfile
Fixes #376 + .dockerignore improvement
2022-01-12 18:04:00 +03:00
Konstantin Rybakov
52e7cef7ef Merge pull request #402 from Sueqkjs/master
fixed spell
2022-01-12 17:08:29 +03:00
Sueqkjs
fbff1bc201 fixed spell
I fixed the spelling because it didn't work~ (crying)
2021-12-27 22:11:12 +09:00
Denis Berezin
7af15cc32d Merge pull request #384 from brunosaboia/remove-trailing-spaces
Remove trailing whitespaces
2021-11-22 17:44:40 +03:00
Denis Berezin
7f397ce753 Merge pull request #398 from toptal/npm-audit-issues
Fix high severity npm issues
2021-11-22 17:35:14 +03:00
Denis Berezin
8f8b039f65 Fix high severity npm issues 2021-11-22 17:33:31 +03:00
Denis Berezin
eeaf2d7b18 Merge pull request #393 from jmartin84/391
fixed STORAGE_USERNAME typo in dockerfile
2021-11-22 17:20:06 +03:00
Justen Martin
db0b7d6444 fixed STORAGE_USERNAME typo in dockerfile 2021-10-06 22:40:46 -05:00
lksz
db6e7603f9 Fixes #376 + .dockerignore improvement 2021-09-10 01:02:26 -04:00
20fb7f9bc2 Merge upstream new version 2021-08-25 00:16:57 +05:30
Bruno Saboia
ad5d7549d7 Remove trailing whitespaces
Dockerfile contained useless trailing whitespaces, which could generate
"ghost" diffs.
2021-07-14 11:00:42 +02:00
991f26e871 Serve plain text if Accept header isn't set 2020-10-23 15:18:37 +05:30
e2293900de Return text/plain if html isn't explicitly specified in Accept header 2020-10-23 15:10:05 +05:30
John Crepezzi
5d2965ffc5 Merge pull request #350 from seejohnrun/specify-config-on-boot
Allow setting config.js alternative on boot
2020-10-06 22:15:59 -04:00
John Crepezzi
f255928af7 Allow setting config.js alternative on boot
Closes #105
2020-10-06 22:15:13 -04:00
John Crepezzi
a108dbadc5 Merge pull request #349 from seejohnrun/add-head-support
Add support for HEAD requests
2020-10-06 22:07:19 -04:00
John Crepezzi
c409aca080 Add support for HEAD requests
On regular document endpoints, and on raw endpoints
2020-10-06 22:05:22 -04:00
John Crepezzi
219424550b Fix local name 2020-10-06 21:02:16 -04:00
John Crepezzi
f147acb51c Switch to using pg.Pool 2020-10-06 21:01:14 -04:00
John Crepezzi
9a692ed652 Get the client working as expected with pg 8 2020-10-06 21:01:10 -04:00
John Crepezzi
3a17c86a0f Upgrade pg to the most recent version
This isn't going to actually work yet, just getting things in place
2020-10-06 21:01:01 -04:00
John Crepezzi
677a22987a Merge pull request #122 from Roundaround/mongodb
Added mongodb document store adapter
2020-10-06 01:46:59 -04:00
John Crepezzi
89d912c6ff Merge pull request #347 from seejohnrun/fix-memcached
Fix memcached client fetch for key not found
2020-10-06 01:39:52 -04:00
John Crepezzi
4cac6713ef Fix memcached client fetch for key not found
The memcached client wasn't correctly handling looking up a key that
didn't exist.  Now we only try to push the expiration forward if there
is actually a value in memcached.

Also while I'm in here, allow expiration to be left blank.
2020-10-06 01:36:46 -04:00
John Crepezzi
f3b0de745b Merge pull request #200 from kevinhaendel/master
Added "user-select" option to line numbers & messages
2020-10-06 01:21:16 -04:00
John Crepezzi
cc8a99752f Merge pull request #271 from mklkj/fix-content-type
Fix contentType header in save request
2020-10-06 01:18:34 -04:00
John Crepezzi
6853d077e7 Merge branch 'master' into fix-content-type 2020-10-06 01:18:22 -04:00
John Crepezzi
80a2b6f0dd Merge pull request #241 from meseta/master
Add Google Datastore sorage handler
2020-10-06 01:15:24 -04:00
John Crepezzi
4f68b3d7d6 Merge branch 'master' into meseta/master 2020-10-06 01:13:07 -04:00
John Crepezzi
ef0ca40533 Downgrade pg for now
Will make a PR to use the new APIs soon
2020-10-06 00:54:12 -04:00
John Crepezzi
f372ef18de Merge pull request #345 from seejohnrun/fix-json-highlighting
Use the now-separate json mode for json highlighting
2020-10-06 00:37:21 -04:00
John Crepezzi
181a3a2bfa Use the now-separate json mode for json highlighting
Closes #267
2020-10-06 00:36:41 -04:00
John Crepezzi
61d08afb3b Merge pull request #344 from seejohnrun/upgrade-highlight-js
Update highlight JS to the most recent version (10.2.1)
2020-10-06 00:10:59 -04:00
John Crepezzi
1ba025328d Update highlight JS to the most recent version 2020-10-06 00:07:07 -04:00
John Crepezzi
a79fb39f54 Merge branch 'master' of github.com:seejohnrun/haste-server 2020-10-05 23:52:13 -04:00
John Crepezzi
3a72d74537 Fix security vulnerabilities from outdated packages
Closes #258
2020-10-05 23:50:28 -04:00
John Crepezzi
e9ae74b7a9 Merge pull request #322 from sethsmoe/patch-1
remove 1px margin from textarea, fixes useless scrollbar
2020-09-22 15:42:35 -04:00
John Crepezzi
c305e9a83d Merge pull request #342 from seejohnrun/dependabot/npm_and_yarn/bl-4.0.3
Bump bl from 4.0.2 to 4.0.3
2020-09-22 15:41:12 -04:00
dependabot[bot]
16bce4c83d Bump bl from 4.0.2 to 4.0.3
Bumps [bl](https://github.com/rvagg/bl) from 4.0.2 to 4.0.3.
- [Release notes](https://github.com/rvagg/bl/releases)
- [Commits](https://github.com/rvagg/bl/compare/v4.0.2...v4.0.3)

Signed-off-by: dependabot[bot] <support@github.com>
2020-09-22 19:40:54 +00:00
John Crepezzi
661997cd73 Merge pull request #334 from ourforks/master
[Security] Update dependencies to reduce risk
2020-09-22 15:40:27 -04:00
John Crepezzi
159f989d08 Merge pull request #335 from emillen/docker-support
Docker support
2020-09-22 15:39:20 -04:00
emil-lengman
139df62ec4 add newline to stop github complaining 2020-08-22 22:27:05 +02:00
emil-lengman
bae6387bb7 forgot to rename some vars 2020-08-22 22:25:10 +02:00
emil-lengman
bb7b9571a7 write some documentation for the Docker solution 2020-08-22 22:22:08 +02:00
emil-lengman
a4dc29fb2b its supposed to be milliseconds 2020-08-22 22:12:54 +02:00
emil-lengman
342f56ce1a use same password and username env vars for all types 2020-08-22 21:56:58 +02:00
emil-lengman
05ecc90764 add file path 2020-08-22 21:47:48 +02:00
emil-lengman
69cf505a90 remove pg connect string, add rethink user and password 2020-08-22 21:29:41 +02:00
emil-lengman
9f41993566 also install rethinkdb and aws-sdk 2020-08-22 21:06:02 +02:00
emil-lengman
5c9311fb85 remove unused import 2020-08-22 20:56:45 +02:00
emil-lengman
5a8d52a5e3 add healthcheck, and stopsignal, plus export the correct port 2020-08-22 20:56:11 +02:00
emil-lengman
0f145b4444 pin versions 2020-08-22 20:48:32 +02:00
emil-lengman
aef4bb5edb add dockerignore file 2020-08-22 20:45:27 +02:00
emil-lengman
36c854ef1b move creating the config file to a js file 2020-08-22 20:44:32 +02:00
emil-lengman
edd428ff37 fix some names for env vars 2020-08-22 20:43:39 +02:00
emil-lengman
0612ba001e basic docker-compose for running the project together with memcached 2020-08-22 17:33:40 +02:00
emil-lengman
064680003d basic dockerfile with default env vars 2020-08-22 17:33:19 +02:00
emil-lengman
655f2af45a script for turning env-vars into config.js 2020-08-22 17:33:01 +02:00
Reece Dunham
ce03749c2f Update dependencies to reduce security risk
Signed-off-by: Reece Dunham <me@rdil.rocks>
2020-08-12 23:45:18 +00:00
24ed412f50 Redo the UI. Light Theme. 2020-07-16 17:25:37 +05:30
7bd0fcc621 Turn background light. 2020-07-15 20:47:26 +05:30
e718672b58 Use tomorrow instead of tomorrow-night-bright. 2020-07-15 18:14:33 +05:30
epdn
f6084b4339 remove 1px margin from textarea, fixes useless scrollbar 2020-05-18 09:34:57 +01:00
John Crepezzi
9b0a5ff0a3 Merge pull request #291 from j3parker/s3-document-store
Add an Amazon S3 document store
2020-02-27 11:39:12 -05:00
a4ad0e1fa6 Update about.md 2019-07-27 20:41:23 +05:30
af28e0c5d9 [Breaking] Change config.js to config.json 2019-07-27 20:40:50 +05:30
240a9f7fde Serve local jquery. Use tomorrow-night-bright instead of solarized_dark 2019-07-17 16:48:33 +05:30
dca3237a71 Change the look. 2019-07-17 16:38:49 +05:30
a541630848 Add /:id/raw 2019-07-17 16:38:13 +05:30
e067323714 Change about.md 2019-07-17 16:30:37 +05:30
Jacob Parker
1fff48568f Document the IAM permissions 2019-07-08 16:59:04 +01:00
Jacob Parker
b4c666fbcf Add an Amazon S3 document store 2019-06-28 19:25:49 +01:00
John Crepezzi
b866c33c93 Merge pull request #173 from sebastiansterk/master
removed padding for #box for correct view
2019-04-05 17:30:42 -04:00
Mikołaj Pich
035cf0e91e Fix content type 2018-12-22 15:11:37 +01:00
John Crepezzi
f3838ab4a8 Merge pull request #251 from seejohnrun/handle-redis-disconnect
Handle redis error and re-establish connection
2018-09-19 10:38:34 -04:00
John Crepezzi
bf2b1c957a Handle redis error and re-establish connection 2018-09-19 10:37:34 -04:00
Yuan Gao
86bbc1899d Update README.md 2018-09-01 21:12:30 +01:00
Yuan Gao
d41d7491d4 rename to google-datastore, and use Date.now() 2018-09-01 21:11:58 +01:00
Yuan
5fb43eb67c added condition for this.expire not defined 2018-08-28 01:28:26 +01:00
Yuan
1eeef4ede4 restored using null 2018-08-28 01:21:37 +01:00
Yuan
ebc749c5e0 updated readme 2018-08-28 00:37:21 +01:00
Yuan
b0bbb72f35 updated to use Date(null) 2018-08-28 00:35:09 +01:00
Yuan
2213c3874a updated readme 2018-08-27 23:48:19 +01:00
Yuan
6ebd72a86c updated readme 2018-08-27 23:34:56 +01:00
Yuan
b6814a1445 bugfixes 2018-08-27 23:15:02 +01:00
Yuan
e3d18efdc6 added npm package 2018-08-27 23:01:37 +01:00
Yuan
869fb65738 added googledatastore handler 2018-08-27 22:59:05 +01:00
Yuan Gao
56b939124e Merge pull request #2 from seejohnrun/master
update from source
2018-08-27 22:58:29 +01:00
John Crepezzi
ee1c1c0856 Merge pull request #231 from seejohnrun/ensure-raw-utf8
Added charset to raw content type
2018-07-12 14:26:05 -04:00
John Crepezzi
b087ac8dd1 Added charset to raw content type
Closes #230
2018-07-12 14:25:27 -04:00
John Crepezzi
d922667f56 Merge pull request #221 from PassTheMayo/patch-1
Fixed unnecessary logging when document not found
2018-04-30 19:45:16 -04:00
Jacob Gunther
5f6fefa7a6 Fixed unnecessary logging when document not found 2018-04-30 16:40:28 -05:00
John Crepezzi
faa7e679ca Merge pull request #216 from PassTheMayo/master
Fixed RethinkDB document store
2018-04-16 11:53:28 -04:00
Jacob Gunther
cd3bf26dbe Use local method for md5 2018-04-16 10:52:53 -05:00
Jacob Gunther
830dc1bc43 Use uploads table 2018-04-15 23:16:39 -05:00
Jacob Gunther
dc0f151a7f Fixed bug in RethinkDB document store and use classes 2018-04-15 23:16:08 -05:00
John Crepezzi
7f625e22f7 Merge pull request #203 from seejohnrun/rewrite_memcached
Rewrite the memcached client
2018-04-13 16:10:42 -04:00
John Crepezzi
528b7b07a8 Merge pull request #215 from Razzeee/patch-1
Update docs to real defaults
2018-04-09 13:08:36 -04:00
Razzeee
2b81e67ce7 Update docs to real defaults 2018-04-09 19:05:21 +02:00
John Crepezzi
827e7b51b5 Rewrite the memcached client
* Update syntax to ES6
* Use `memcached` instead of `memcache`
* Fix restrictions where expirations weren't pushed forward on GET
* Fix a bug where we were unnecessarily bumping expirations on key search

Closes #201
2018-02-16 09:52:44 -05:00
Kevin Händel
16d529e935 Added "user-select" option to line numbers & messages
This prevents copying unnecessary text after selecting it via Ctrl + A
2018-02-11 00:35:27 +01:00
John Crepezzi
ad7702aaf4 Merge pull request #194 from szepeviktor/patch-1
Change to HTTPS in about.md
2018-01-22 10:30:12 -05:00
Viktor Szépe
f5fbc8d19e Change to HTTPS in about.md
Please also change the repo URL here on GitHub
2018-01-22 14:51:15 +01:00
John Crepezzi
0a8923bf12 Merge pull request #192 from C0rn3j/master
Add note about paste expiration, cosmetic fixes.
2017-12-30 20:24:01 -05:00
Martin
4d572a2ec0 convert relative path to absolute 2017-12-30 17:11:29 +01:00
Martin
d9a53d3e6e Add note about paste expiration, cosmetic fixes. 2017-12-29 20:42:11 +01:00
John Crepezzi
8da37ea5de Merge pull request #190 from PassTheMayo/patch-1
Oh noes! I didn't even notice that I had a typo...
2017-12-11 10:51:12 -05:00
Jacob Gunther
ff0fccd6c2 Oh noes! I didn't even notice that I had a typo... 2017-12-11 09:50:50 -06:00
John Crepezzi
63c4576633 Merge pull request #189 from PassTheMayo/master
Added RethinkDB storage option & fixed config to use proper JSON
2017-12-11 10:46:45 -05:00
Jacob Gunther
b31d143bcd Revert config.js to previous state 2017-12-11 09:45:37 -06:00
Jacob Gunther
0d8aec8d61 Oops, forgot to fix that file name 2017-12-11 09:44:23 -06:00
Jacob Gunther
1f9fdd205d Undid changes to server.js 2017-12-11 09:28:27 -06:00
Jacob Gunther
cdd0cf3739 Fixed requested changes to RethinkDB handler 2017-12-11 09:27:44 -06:00
PassTheMayo
ba5c6b8d16 Added RethinkDB storage option & fixed config to use proper JSON 2017-12-09 18:34:00 -06:00
John Crepezzi
cfef588283 Merge pull request #181 from seejohnrun/simplify_uglify
Upgrade uglify and simplify usage
2017-10-31 21:20:23 -04:00
John Crepezzi
318c5f7ba6 Upgrade uglify and simplify usage
- Upgrade to the most recent version of uglify
- Use the `UglifyJS.minify(code)` helper which does exactly what we want
2017-10-31 21:19:22 -04:00
John Crepezzi
ee03e7cd78 Merge pull request #180 from seejohnrun/es6_generators
ES6 generators
2017-10-31 21:11:32 -04:00
John Crepezzi
3b6934e348 Phonetic key generator to es6 and add some tests 2017-10-31 21:10:25 -04:00
John Crepezzi
f161cc33b4 Added tests and converted dictionary key generator to es6 2017-10-31 20:55:59 -04:00
John Crepezzi
40f1f2588e Update some es6 2017-10-31 20:55:51 -04:00
John Crepezzi
e4e025f67e Convert random generator to es6 and add some specs for it directly 2017-10-31 20:40:43 -04:00
John Crepezzi
e12805a8aa Merge pull request #179 from seejohnrun/upgrade_testing
Upgrade testing libraries
2017-10-31 20:04:37 -04:00
John Crepezzi
e76c845f16 Upgrade testing libraries
- Upgrade mocha
- Remove should due to limited usage and old style (at least by rspec standards)
- Move spec -> test which is the default
- Update tests accordingly for the above
2017-10-31 20:03:30 -04:00
John Crepezzi
072418695e Merge pull request #178 from seejohnrun/upgrade_highlight
Upgrade highlight.js
2017-10-31 19:49:20 -04:00
John Crepezzi
584b66bc66 Upgrade highlight.js 2017-10-31 19:48:55 -04:00
Sebastian Sterk
f8db455f74 removed padding for #box for correct view 2017-10-13 15:13:04 +02:00
John Crepezzi
f19c5d1049 Fix typo in README
Closes #165
2017-08-03 11:45:51 +00:00
John Crepezzi
c5b859ec98 Bump node engine version & fix asset compression on start 2017-07-11 21:13:33 -04:00
John Crepezzi
2ee93a7409 Merge pull request #160 from seejohnrun/complete_eslint
Fix eslint
2017-06-26 12:38:57 -04:00
John Crepezzi
bf1dbb68b8 Fix eslint 2017-06-26 12:38:17 -04:00
John Crepezzi
cf28e23d8e Merge pull request #159 from seejohnrun/add_eslint
Added eslint and fixed an issue from #158
2017-06-26 12:29:10 -04:00
John Crepezzi
5939dec185 Added eslint and fixed an issue from #158 2017-06-26 12:19:36 -04:00
John Crepezzi
3ed1d775ac Merge pull request #158 from KlasafGeijerstam/master
Added dictionary key generator
2017-06-26 12:11:44 -04:00
John Crepezzi
87b1c76aaf One more 2017-06-26 12:11:19 -04:00
John Crepezzi
4599203bdf A few style nit-picks 2017-06-26 12:10:57 -04:00
Klas af Geijerstam
d66bc9a6c4 Removed unused lines 2017-06-26 18:09:13 +02:00
Klas af Geijerstam
80f0618736 Updated dictionary.js
Now expects a newline separated dictionary, supports both \n and \n\r
2017-06-26 18:03:18 +02:00
Klas af Geijerstam
ac2bceefbb Added missing ) 2017-06-26 17:42:24 +02:00
Klas af Geijerstam
dbf4f6b5dd Removed usage of random-js
Replaced random-js with vanilla JS random
2017-06-26 17:39:32 +02:00
Klas af Geijerstam
8e9205cecc Update dictionary.js 2017-06-26 17:37:04 +02:00
Klas af Geijerstam
e54a860172 Added dictionary.js
A key generator that uses a dictionary to create its keys
2017-06-26 17:17:52 +02:00
John Crepezzi
5a8697cdd8 Fix typo in about.md 2017-05-02 17:32:34 -04:00
John Crepezzi
091ea973a8 Merge pull request #138 from Wohlstand/patch-2
Added a note about Redis's password field
2017-04-21 07:51:23 -04:00
John Crepezzi
939b7221ab Merge pull request #146 from seejohnrun/dont_expire_static_documents
Don't expire static documents on raw read
2017-01-28 11:58:12 -05:00
John Crepezzi
934aaf7f51 Don't expire static documents on raw read 2017-01-28 11:57:25 -05:00
Vitaly Novichkov
930e21ccb7 Added a note about Redis's password field 2016-11-06 02:25:51 +04:00
John Crepezzi
eb5c8eef6a Merge pull request #135 from seejohnrun/horizontal_scroll_fix
Fix horizontal scroll overflow
2016-09-30 10:40:15 -05:00
John Crepezzi
03dd611a86 Fix horizontal scroll overflow
Caused by highlight.js upgrade
2016-09-30 09:57:46 -05:00
John Crepezzi
f24376b192 Merge pull request #129 from Wohlstand/patch-1
Fix working error under Firefox
2016-09-19 09:36:56 -04:00
Vitaly Novichkov
eea359d0ec Update index.html
Short charset command better for HTML5
2016-09-13 15:11:20 +04:00
John Crepezzi
af9a71549b Upgrade highlight.js
Also adds swift support

Closes #69
2016-09-12 20:10:15 -04:00
John Crepezzi
3178676fba Merge pull request #131 from seejohnrun/update_doc
Update documentation to mention redis-server
2016-08-22 13:23:11 -04:00
John Crepezzi
3bdfab8219 Update documentation to mention redis-server 2016-08-22 13:22:49 -04:00
John Crepezzi
a3a24d9765 Merge pull request #127 from Gwemox/master
Compatibility with screen reader
2016-08-17 14:03:13 -04:00
Vitaly Novichkov
8afb53e77e Fix working error under Firefox
On some computers because undeclared charset, highlight.min.js parsing incorrectly while some charsets (For example, Cyrillic-1251) are toggled. Problem is going away when I manually toggling Europan or UTF-8 charset.
2016-08-03 14:15:58 +04:00
Thibault Buathier
d6d9cf40f9 Compatibility with screen reader 2016-07-30 16:15:53 +02:00
Thibault Buathier
1010a142e2 Compatibility with screen reader 2016-07-30 16:15:26 +02:00
Evan Steinkerchner
d3db5e2a5d Added mongodb document store adapter 2016-06-10 16:43:43 -04:00
John Crepezzi
0209375865 Revert "Added cloudron external provider"
This reverts commit 00a9d9c312.
2016-03-12 13:09:28 -05:00
John Crepezzi
00a9d9c312 Added cloudron external provider 2016-03-12 13:07:58 -05:00
John Crepezzi
6835eef468 Merge pull request #109 from seejohnrun/rate_limiting
Added user-configurable rate limiting
2016-03-10 11:44:57 -10:00
John Crepezzi
4626fd9c8d Merge pull request #81 from pangeacake/master
Add postgres information to README.md
2016-03-06 19:06:04 -05:00
John Crepezzi
fbb6e63c37 Added a note to the README 2016-03-06 16:34:33 -05:00
John Crepezzi
84c909a5db Added user-configurable rate limiting 2016-03-06 16:20:40 -05:00
John Crepezzi
45e19bc7cc fix indentation 2015-12-27 12:59:59 -05:00
John Crepezzi
233bc6ff16 Merge pull request #77 from abn/redis-auth
Support authentication for redis store if password provided
2015-12-27 12:58:24 -05:00
PangeaCake
360b325ced Smaller typo 2015-01-07 14:30:12 -08:00
PangeaCake
e93f98112b Add pg as dependency and update node version
One of the dependencies seemed to be broken with the previous node version, but this node version worked perfectly
2015-01-07 14:27:46 -08:00
PangeaCake
05cb051bc8 Tiny fix 2015-01-07 14:25:43 -08:00
PangeaCake
031cdd738a Add postgres information to README.md 2015-01-07 14:24:50 -08:00
Arun Babu Neelicattu
c92ab077c0 Support authentication for redis store if password provided 2014-11-21 23:17:19 +10:00
John Crepezzi
6c31389327 Merge pull request #64 from lidl/master
Make table creation comment a one-liner.
2014-07-16 07:54:00 -04:00
lidl
a8d4f3c300 Make table creation comment a one-liner. 2014-06-27 18:40:05 -04:00
John Crepezzi
ab029eae2f Added postgres adapter 2014-06-09 16:50:43 -04:00
John Crepezzi
447d0aae76 PG basis 2014-06-09 14:48:35 -04:00
John Crepezzi
4870158430 Merge branch 'master' of github.com:seejohnrun/haste-server 2014-04-21 14:17:09 -04:00
John Crepezzi
0471b059a0 Support a form-data POST API
Closes #54
2014-04-21 14:16:23 -04:00
John Crepezzi
5bbe50b481 Merge pull request #59 from jomo/phonetic
Phonetic key improvements
2014-04-15 11:00:49 -04:00
JonApps
bda2749879 oops 🍺 2014-03-25 02:23:31 +01:00
JonApps
028aa96b13 phonetic keys can begin with vowel + added missing 'z' to consontants 2014-03-25 02:20:05 +01:00
John Crepezzi
2deda5b68a Merge pull request #56 from joeykrim/patch-1
Small typo
2014-02-27 19:26:31 -05:00
joeykrim
ee7098457e Small typo
Changed "its all open source" to "it's all open source"
2014-02-27 17:57:37 -05:00
John Crepezzi
7a08960414 Merge branch 'master' of github.com:seejohnrun/haste-server 2013-11-24 11:54:34 -05:00
John Crepezzi
89909747f1 Don't depend on err.message for redis errors [#49] 2013-11-24 11:54:01 -05:00
John Crepezzi
202e695e07 Remove GA from index.html on Master 2013-10-31 08:44:33 -04:00
John Crepezzi
48e8e79659 Remove support from README 2013-08-13 13:06:37 -04:00
John Crepezzi
abb49f2cf3 update about.md 2013-03-12 21:59:10 -04:00
John Crepezzi
d1cd2a5213 Proper 2013-01-11 10:00:35 -08:00
John Crepezzi
27317844e0 Remove nl 2013-01-11 09:57:10 -08:00
John Crepezzi
ee74e2fa90 Merge branch 'production' of github.com:seejohnrun/haste-server into production 2012-12-29 18:15:45 -05:00
John Crepezzi
5d8bd2e6f8 Merge branch 'master' into production 2012-12-29 18:15:32 -05:00
John Crepezzi
cd4c7aeab8 Merge pull request #37 from naftis/patch-1
Bugfix to solarized_dark.css
2012-12-28 07:10:36 -08:00
naftis
e37c3cf1b9 Bugfix to solarized_dark.css
Fixed bug with font-style: italic and WebKit.
WebKit makes line-height bigger, when italic is used.
2012-12-28 16:11:21 +02:00
John Crepezzi
8858bab985 Merge branch 'master' into production 2012-12-23 10:54:15 -05:00
John Crepezzi
afb0c332cc Added shift modifier to twitter shortcut
Closes #29
2012-12-23 10:53:53 -05:00
John Crepezzi
82c58c5c0c Merge branch 'master' into production 2012-12-19 08:18:14 -05:00
John Crepezzi
46bdd27431 Fix for type name ;)
Closes #28
2012-12-19 08:17:52 -05:00
John Crepezzi
1adfba1a37 Merge branch 'master' into production 2012-12-19 08:13:36 -05:00
John Crepezzi
54e55b1b0d Added JSON to extension map (JS)
Closes #28
2012-12-19 08:12:08 -05:00
John Crepezzi
08d37cc7f7 Added support section 2012-10-22 14:40:52 -04:00
John Crepezzi
aa781957e8 Update Copyright 2012-09-27 13:46:27 -04:00
John Crepezzi
c00477c93c Update Copyright 2012-09-27 13:46:09 -04:00
John Crepezzi
035f09ac05 GA 2012-09-27 13:43:53 -04:00
John Crepezzi
36e00bb29e Remove 'localhost' references 2012-09-27 12:03:52 -04:00
John Crepezzi
10623873e8 Allow host setting by ENV 2012-09-27 12:01:00 -04:00
John Crepezzi
e536ba1019 Move to an available npm version 2012-09-27 11:56:49 -04:00
John Crepezzi
85fc36d710 Update npm version 2012-09-27 11:56:15 -04:00
John Crepezzi
5d5ae164f3 Set up node engine version 2012-09-27 11:54:40 -04:00
John Crepezzi
79309c75df Bump version to 0.1.0 2012-09-27 11:51:15 -04:00
John Crepezzi
4b58c8d356 Added more loggin 2012-09-27 11:50:56 -04:00
John Crepezzi
8f0d6260b0 change how redistogo install works 2012-09-27 11:50:12 -04:00
John Crepezzi
93a83a35da Logging 2012-09-27 11:47:23 -04:00
John Crepezzi
4efc5d47d9 Allow redistogo 2012-09-27 11:46:53 -04:00
John Crepezzi
ff8ef54e34 Procfile 2012-09-27 11:38:14 -04:00
John Crepezzi
814a49812a Update server config path 2012-09-19 14:28:52 -04:00
John Crepezzi
e0610bc1be Fix multiple document loading
Closes #32
2012-08-13 11:33:20 -04:00
John Crepezzi
962976c204 Pad the right 2012-06-22 15:33:07 -04:00
John Crepezzi
16080bdc16 Update description - preparing for npm push 2012-04-21 23:49:39 -04:00
John Crepezzi
20ce741341 Fix indentation 2012-04-07 23:51:48 -04:00
John Crepezzi
13bb094fb3 Revert "Refactor frontend"
This reverts commit 1950cc8db0.
2012-03-19 18:17:39 -04:00
John Crepezzi
b43a55ffda Merge pull request #25 from zaeleus/backbone
Refactor frontend
2012-03-19 15:11:08 -07:00
John Crepezzi
45cbdcce70 Force down connect version 2012-03-02 14:07:59 -05:00
Michael Macias
1950cc8db0 Refactor frontend
* restructured JavaScript using backbone.js
* replaced highlight.js with CodeMirror for its editor
* added CodeMirror Solarized (dark) theme based on Ethan Schoonover's solarized.vim
* changed `POST /document` to accept real JSON
* cleaned up template and stylesheet
2012-02-18 02:40:56 -06:00
John Crepezzi
90cfe0ec57 Upgrade jquery to 1.7.1 2012-02-07 17:52:48 -05:00
John Crepezzi
87e28548b9 Explicitly set encoding
Closes #24
2012-02-07 17:52:31 -05:00
614 changed files with 3117 additions and 78350 deletions

.dockerignore (new file, 8 lines changed)

@@ -0,0 +1,8 @@
Dockerfile
.git
npm-debug.log
node_modules
*.swp
*.swo
data
*.DS_Store

.eslintignore (new file, 2 lines changed)

@@ -0,0 +1,2 @@
**/*.min.js
config.js

.eslintrc.json (new file, 25 lines changed)

@@ -0,0 +1,25 @@
{
"env": {
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"rules": {
"indent": [
"error",
2
],
"linebreak-style": [
"error",
"unix"
],
"quotes": [
"error",
"single"
],
"semi": [
"error",
"always"
]
}
}

.gitignore (vendored, 2 lines changed)

@@ -1,5 +1,7 @@
npm-debug.log
node_modules
*.swp
*.swo
data
*.DS_Store
config.json

Dockerfile (new file, 68 lines changed)

@@ -0,0 +1,68 @@
FROM node:14.8.0-stretch
RUN mkdir -p /usr/src/app && \
chown node:node /usr/src/app
USER node:node
WORKDIR /usr/src/app
COPY --chown=node:node . .
RUN npm install && \
npm install redis@0.8.1 && \
npm install pg@4.1.1 && \
npm install memcached@2.2.2 && \
npm install aws-sdk@2.738.0 && \
npm install rethinkdbdash@2.3.31
ENV STORAGE_TYPE=memcached \
STORAGE_HOST=127.0.0.1 \
STORAGE_PORT=11211\
STORAGE_EXPIRE_SECONDS=2592000\
STORAGE_DB=2 \
STORAGE_AWS_BUCKET= \
STORAGE_AWS_REGION= \
STORAGE_USENAME= \
STORAGE_PASSWORD= \
STORAGE_FILEPATH=
ENV LOGGING_LEVEL=verbose \
LOGGING_TYPE=Console \
LOGGING_COLORIZE=true
ENV HOST=0.0.0.0\
PORT=7777\
KEY_LENGTH=10\
MAX_LENGTH=400000\
STATIC_MAX_AGE=86400\
RECOMPRESS_STATIC_ASSETS=true
ENV KEYGENERATOR_TYPE=phonetic \
KEYGENERATOR_KEYSPACE=
ENV RATELIMITS_NORMAL_TOTAL_REQUESTS=500\
RATELIMITS_NORMAL_EVERY_MILLISECONDS=60000 \
RATELIMITS_WHITELIST_TOTAL_REQUESTS= \
RATELIMITS_WHITELIST_EVERY_MILLISECONDS= \
# comma separated list for the whitelisted \
RATELIMITS_WHITELIST=example1.whitelist,example2.whitelist \
\
RATELIMITS_BLACKLIST_TOTAL_REQUESTS= \
RATELIMITS_BLACKLIST_EVERY_MILLISECONDS= \
# comma separated list for the blacklisted \
RATELIMITS_BLACKLIST=example1.blacklist,example2.blacklist
ENV DOCUMENTS=about=./about.md
EXPOSE ${PORT}
STOPSIGNAL SIGINT
ENTRYPOINT [ "bash", "docker-entrypoint.sh" ]
HEALTHCHECK --interval=30s --timeout=30s --start-period=5s \
--retries=3 CMD [ "sh", "-c", "echo -n 'curl localhost:7777... '; \
(\
curl -sf localhost:7777 > /dev/null\
) && echo OK || (\
echo Fail && exit 2\
)"]
CMD ["npm", "start"]

Procfile (new file, 1 line changed)

@@ -0,0 +1 @@
web: node server.js

README.md (259 lines changed)

@@ -31,21 +31,31 @@ STDOUT. Check the README there for more details and usages.
1. Download the package, and expand it
2. Explore the settings inside of config.js, but the defaults should be good
3. `npm install`
4. `npm start`
4. `npm start` (you may specify an optional `<config-path>` as well)
## Settings
* `host` - the host the server runs on (default localhost)
* `port` - the port the server runs on (default 7777)
* `keyLength` - the length of the keys to user (default 10)
* `maxLength` - maximum length of a paste (default none)
* `maxLength` - maximum length of a paste (default 400000)
* `staticMaxAge` - max age for static assets (86400)
* `recompressStatisAssets` - whether or not to compile static js assets (true)
* `recompressStaticAssets` - whether or not to compile static js assets (true)
* `documents` - static documents to serve (ex: http://hastebin.com/about.com)
in addition to static assets. These will never expire.
* `storage` - storage options (see below)
* `logging` - logging preferences
* `keyGenerator` - key generator options (see below)
* `rateLimits` - settings for rate limiting (see below)
## Rate Limiting
When present, the `rateLimits` option enables built-in rate limiting courtesy
of `connect-ratelimit`. Any of the options supported by that library can be
used and set in `config.js`.
See the README for [connect-ratelimit](https://github.com/dharmafly/connect-ratelimit)
for more information!
## Key Generation
@@ -55,7 +65,7 @@ Attempts to generate phonetic keys, similar to `pwgen`
``` json
{
"type": "phonetic"
"type": "phonetic"
}
```
@@ -65,7 +75,7 @@ Generates a random key
``` json
{
"type": "random",
"type": "random",
"keyspace": "abcdef"
}
```
@@ -82,16 +92,19 @@ something like:
``` json
{
"path": "./data",
"type": "file"
"path": "./data",
"type": "file"
}
```
Where `path` represents where you want the files stored
where `path` represents where you want the files stored.
File storage currently does not support paste expiration, you can follow [#191](https://github.com/seejohnrun/haste-server/issues/191) for status updates.
### Redis
To use redis storage you must install the redis package in npm
To use redis storage you must install the `redis` package in npm, and have
`redis-server` running on the machine.
`npm install redis`
@@ -99,10 +112,10 @@ Once you've done that, your config section should look like:
``` json
{
"type": "redis",
"host": "localhost",
"port": 6379,
"db": 2
"type": "redis",
"host": "localhost",
"port": 6379,
"db": 2
}
```
@@ -112,19 +125,70 @@ or post.
All of which are optional except `type` with very logical default values.
### Memcached
If your Redis server is configured for password authentification, use the `password` field.
To use memcached storage you must install the `memcache` package via npm
### Postgres
`npm install memcache`
To use postgres storage you must install the `pg` package in npm
`npm install pg`
Once you've done that, your config section should look like:
``` json
{
"type": "memcached",
"host": "127.0.0.1",
"port": 11211
"type": "postgres",
"connectionUrl": "postgres://user:password@host:5432/database"
}
```
You can also just set the environment variable for `DATABASE_URL` to your database connection url.
You will have to manually add a table to your postgres database:
`create table entries (id serial primary key, key varchar(255) not null, value text not null, expiration int, unique(key));`
You can also set an `expire` option to the number of seconds to expire keys in.
This is off by default, but will constantly kick back expirations on each view
or post.
All of which are optional except `type` with very logical default values.
### MongoDB
To use mongodb storage you must install the 'mongodb' package in npm
`npm install mongodb`
Once you've done that, your config section should look like:
``` json
{
"type": "mongo",
"connectionUrl": "mongodb://localhost:27017/database"
}
```
You can also just set the environment variable for `DATABASE_URL` to your database connection url.
Unlike with postgres you do NOT have to create the table in your mongo database prior to running.
You can also set an `expire` option to the number of seconds to expire keys in.
This is off by default, but will constantly kick back expirations on each view or post.
### Memcached
To use memcache storage you must install the `memcached` package via npm
`npm install memcached`
Once you've done that, your config section should look like:
``` json
{
"type": "memcached",
"host": "127.0.0.1",
"port": 11211
}
```
@@ -134,6 +198,161 @@ forward on GETs.
All of which are optional except `type` with very logical default values.
### RethinkDB
To use the RethinkDB storage system, you must install the `rethinkdbdash` package via npm
`npm install rethinkdbdash`
Once you've done that, your config section should look like this:
``` json
{
"type": "rethinkdb",
"host": "127.0.0.1",
"port": 28015,
"db": "haste"
}
```
In order for this to work, the database must be pre-created before the script is ran.
Also, you must create an `uploads` table, which will store all the data for uploads.
You can optionally add the `user` and `password` properties to use a user system.
### Google Datastore
To use the Google Datastore storage system, you must install the `@google-cloud/datastore` package via npm
`npm install @google-cloud/datastore`
Once you've done that, your config section should look like this:
``` json
{
"type": "google-datastore"
}
```
Authentication is handled automatically by [Google Cloud service account credentials](https://cloud.google.com/docs/authentication/getting-started), by providing authentication details to the GOOGLE_APPLICATION_CREDENTIALS environmental variable.
### Amazon S3
To use [Amazon S3](https://aws.amazon.com/s3/) as a storage system, you must
install the `aws-sdk` package via npm:
`npm install aws-sdk`
Once you've done that, your config section should look like this:
```json
{
"type": "amazon-s3",
"bucket": "your-bucket-name",
"region": "us-east-1"
}
```
Authentication is handled automatically by the client. Check
[Amazon's documentation](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/setting-credentials-node.html)
for more information. You will need to grant your role these permissions to
your bucket:
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Action": [
"s3:GetObject",
"s3:PutObject"
],
"Effect": "Allow",
"Resource": "arn:aws:s3:::your-bucket-name-goes-here/*"
}
]
}
```
## Docker
### Build image
```bash
docker build --tag haste-server .
```
### Run container
For this example we will run haste-server, and connect it to a redis server
```bash
docker run --name haste-server-container --env STORAGE_TYPE=redis --env STORAGE_HOST=redis-server --env STORAGE_PORT=6379 haste-server
```
### Use docker-compose example
There is an example `docker-compose.yml` which runs haste-server together with memcached
```bash
docker-compose up
```
### Configuration
The docker image is configured using environmental variables as you can see in the example above.
Here is a list of all the environment variables
### Storage
| Name | Default value | Description |
| :--------------------: | :-----------: | :-----------------------------------------------------------------------------------------------------------: |
| STORAGE_TYPE | memcached | Type of storage . Accepted values: "memcached","redis","postgres","rethinkdb", "amazon-s3", and "file" |
| STORAGE_HOST | 127.0.0.1 | Storage host. Applicable for types: memcached, redis, postgres, and rethinkdb |
| STORAGE_PORT | 11211 | Port on the storage host. Applicable for types: memcached, redis, postgres, and rethinkdb |
| STORAGE_EXPIRE_SECONDS | 2592000 | Number of seconds to expire keys in. Applicable for types. Redis, postgres, memcached. `expire` option to the |
| STORAGE_DB | 2 | The name of the database. Applicable for redis, postgres, and rethinkdb |
| STORAGE_PASSWORD | | Password for database. Applicable for redis, postges, rethinkdb . |
| STORAGE_USERNAME | | Database username. Applicable for postgres, and rethinkdb |
| STORAGE_AWS_BUCKET | | Applicable for amazon-s3. This is the name of the S3 bucket |
| STORAGE_AWS_REGION | | Applicable for amazon-s3. The region in which the bucket is located |
| STORAGE_FILEPATH | | Path to file to save data to. Applicable for type file |
### Logging
| Name | Default value | Description |
| :---------------: | :-----------: | :---------: |
| LOGGING_LEVEL | verbose | |
| LOGGING_TYPE= | Console |
| LOGGING_COLORIZE= | true |
### Basics
| Name | Default value | Description |
| :----------------------: | :--------------: | :---------------------------------------------------------------------------------------: |
| HOST | 0.0.0.0 | The hostname which the server answers on |
| PORT | 7777 | The port on which the server is running |
| KEY_LENGTH | 10 | the length of the keys to user |
| MAX_LENGTH | 400000 | maximum length of a paste |
| STATIC_MAX_AGE | 86400 | max age for static assets |
| RECOMPRESS_STATIC_ASSETS | true | whether or not to compile static js assets |
| KEYGENERATOR_TYPE | phonetic | Type of key generator. Acceptable values: "phonetic", or "random" |
| KEYGENERATOR_KEYSPACE | | keySpace argument is a string of acceptable characters |
| DOCUMENTS | about=./about.md | Comma separated list of static documents to serve. ex: \n about=./about.md,home=./home.md |
### Rate limits
| Name | Default value | Description |
| :----------------------------------: | :-----------------------------------: | :--------------------------------------------------------------------------------------: |
| RATELIMITS_NORMAL_TOTAL_REQUESTS | 500 | By default anyone uncategorized will be subject to 500 requests in the defined timespan. |
| RATELIMITS_NORMAL_EVERY_MILLISECONDS | 60000 | The timespan to allow the total requests for uncategorized users |
| RATELIMITS_WHITELIST_TOTAL_REQUESTS | | By default client names in the whitelist will not have their requests limited. |
| RATELIMITS_WHITELIST_EVERY_SECONDS | | By default client names in the whitelist will not have their requests limited. |
| RATELIMITS_WHITELIST | example1.whitelist,example2.whitelist | Comma separated list of the clients which are in the whitelist pool |
| RATELIMITS_BLACKLIST_TOTAL_REQUESTS | | By default client names in the blacklist will be subject to 0 requests per hours. |
| RATELIMITS_BLACKLIST_EVERY_SECONDS | | By default client names in the blacklist will be subject to 0 requests per hours |
| RATELIMITS_BLACKLIST | example1.blacklist,example2.blacklist | Comma separated list of the clients which are in the blacklistpool. |
## Author
@@ -143,7 +362,7 @@ John Crepezzi <john.crepezzi@gmail.com>
(The MIT License)
Copyright © 2011 John Crepezzi
Copyright © 2011-2012 John Crepezzi
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the Software), to deal in


@@ -15,44 +15,19 @@ To make a new entry, click "New" (or type 'control + n')
## From the Console
Most of the time I want to show you some text, its coming from my current
console session. We should make it really easy to take code from the console
and send it to people.
[bin-client](git.webionite.com/ceda_ei/bin-client)
`cat something | haste` # http://hastebin.com/1238193
You can even take this a step further, and cut out the last step of copying the
URL with:
* osx: `cat something | haste | pbcopy`
* linux: `cat something | haste | xsel`
* windows: check out [WinHaste](https://github.com/ajryan/WinHaste)
After running that, the STDOUT output of `cat something` will show up at a URL
which has been conveniently copied to your clipboard.
That's all there is to that, and you can install it with `gem install haste`
right now.
* osx: you will need to have an up to date version of Xcode
* linux: you will need to have rubygems and ruby-devel installed
## Duration
Pastes will stay for 30 days from their last view.
## Privacy
While the contents of hastebin.com are not directly crawled by any search robot
that obeys "robots.txt", there should be no great expectation of privacy. Post
things at your own risk. Not responsible for any loss of data or removed
pastes.
Add the following to your bashrc/zshrc
```
export MKR_BIN='https://bin.webionite.com/'
export HASTEBIN=1
```
## Open Source
Haste can easily be installed behind your network, and its all open source!
Haste can easily be installed behind your network, and it's all open source!
* [haste-client](https://github.com/seejohnrun/haste-client)
* [haste-server](https://github.com/seejohnrun/haste-server)
* [haste-server](https://git.webionite.com/Webionite/haste-server)
## Author

docker-compose.yaml (new file, 19 lines changed)

@@ -0,0 +1,19 @@
version: '3.0'
services:
haste-server:
build: .
networks:
- db-network
environment:
- STORAGE_TYPE=memcached
- STORAGE_HOST=memcached
- STORAGE_PORT=11211
ports:
- 7777:7777
memcached:
image: memcached:latest
networks:
- db-network
networks:
db-network:

docker-entrypoint.js (new file, 108 lines changed)

@@ -0,0 +1,108 @@
const {
HOST,
PORT,
KEY_LENGTH,
MAX_LENGTH,
STATIC_MAX_AGE,
RECOMPRESS_STATIC_ASSETS,
STORAGE_TYPE,
STORAGE_HOST,
STORAGE_PORT,
STORAGE_EXPIRE_SECONDS,
STORAGE_DB,
STORAGE_AWS_BUCKET,
STORAGE_AWS_REGION,
STORAGE_PASSWORD,
STORAGE_USERNAME,
STORAGE_FILEPATH,
LOGGING_LEVEL,
LOGGING_TYPE,
LOGGING_COLORIZE,
KEYGENERATOR_TYPE,
KEY_GENERATOR_KEYSPACE,
RATE_LIMITS_NORMAL_TOTAL_REQUESTS,
RATE_LIMITS_NORMAL_EVERY_MILLISECONDS,
RATE_LIMITS_WHITELIST_TOTAL_REQUESTS,
RATE_LIMITS_WHITELIST_EVERY_MILLISECONDS,
RATE_LIMITS_WHITELIST,
RATE_LIMITS_BLACKLIST_TOTAL_REQUESTS,
RATE_LIMITS_BLACKLIST_EVERY_MILLISECONDS,
RATE_LIMITS_BLACKLIST,
DOCUMENTS,
} = process.env;
const config = {
host: HOST,
port: PORT,
keyLength: KEY_LENGTH,
maxLength: MAX_LENGTH,
staticMaxAge: STATIC_MAX_AGE,
recompressStaticAssets: RECOMPRESS_STATIC_ASSETS,
logging: [
{
level: LOGGING_LEVEL,
type: LOGGING_TYPE,
colorize: LOGGING_COLORIZE,
},
],
keyGenerator: {
type: KEYGENERATOR_TYPE,
keyspace: KEY_GENERATOR_KEYSPACE,
},
rateLimits: {
whitelist: RATE_LIMITS_WHITELIST ? RATE_LIMITS_WHITELIST.split(",") : [],
blacklist: RATE_LIMITS_BLACKLIST ? RATE_LIMITS_BLACKLIST.split(",") : [],
categories: {
normal: {
totalRequests: RATE_LIMITS_NORMAL_TOTAL_REQUESTS,
every: RATE_LIMITS_NORMAL_EVERY_MILLISECONDS,
},
whitelist:
RATE_LIMITS_WHITELIST_EVERY_MILLISECONDS ||
RATE_LIMITS_WHITELIST_TOTAL_REQUESTS
? {
totalRequests: RATE_LIMITS_WHITELIST_TOTAL_REQUESTS,
every: RATE_LIMITS_WHITELIST_EVERY_MILLISECONDS,
}
: null,
blacklist:
RATE_LIMITS_BLACKLIST_EVERY_MILLISECONDS ||
RATE_LIMITS_BLACKLIST_TOTAL_REQUESTS
? {
totalRequests: RATE_LIMITS_WHITELIST_TOTAL_REQUESTS,
every: RATE_LIMITS_BLACKLIST_EVERY_MILLISECONDS,
}
: null,
},
},
storage: {
type: STORAGE_TYPE,
host: STORAGE_HOST,
port: STORAGE_PORT,
expire: STORAGE_EXPIRE_SECONDS,
bucket: STORAGE_AWS_BUCKET,
region: STORAGE_AWS_REGION,
connectionUrl: `postgres://${STORAGE_USERNAME}:${STORAGE_PASSWORD}@${STORAGE_HOST}:${STORAGE_PORT}/${STORAGE_DB}`,
db: STORAGE_DB,
user: STORAGE_USERNAME,
password: STORAGE_PASSWORD,
path: STORAGE_FILEPATH,
},
documents: DOCUMENTS
? DOCUMENTS.split(",").reduce((acc, item) => {
const keyAndValueArray = item.replace(/\s/g, "").split("=");
return { ...acc, [keyAndValueArray[0]]: keyAndValueArray[1] };
}, {})
: null,
};
console.log(JSON.stringify(config));
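
The script above prints the assembled config as JSON to stdout; `docker-entrypoint.sh` (next file) redirects that output into `config.js`. As a standalone illustration of the `DOCUMENTS` parsing, here is a minimal sketch that mirrors the reduce shown above; the sample value follows the Dockerfile default and is otherwise an assumption:

```js
// Standalone sketch (not part of the repo): shows what the DOCUMENTS
// parsing above produces for a comma-separated "name=path" list.
const DOCUMENTS = 'about=./about.md,home=./home.md'; // sample value (assumed)

const documents = DOCUMENTS
  ? DOCUMENTS.split(',').reduce((acc, item) => {
      const [name, path] = item.replace(/\s/g, '').split('=');
      return { ...acc, [name]: path };
    }, {})
  : null;

console.log(documents); // { about: './about.md', home: './home.md' }
```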

docker-entrypoint.sh (new file, 9 lines changed)

@@ -0,0 +1,9 @@
#!/bin/bash
# We use this file to translate environmental variables to .env files used by the application
set -e
node ./docker-entrypoint.js > ./config.js
exec "$@"


@@ -1,4 +1,5 @@
var winston = require('winston');
var Busboy = require('busboy');
// For handling serving stored documents
@@ -15,47 +16,68 @@ var DocumentHandler = function(options) {
DocumentHandler.defaultKeyLength = 10;
// Handle retrieving a document
DocumentHandler.prototype.handleGet = function(key, response, skipExpire) {
DocumentHandler.prototype.handleGet = function(request, response, config) {
const key = request.params.id.split('.')[0];
const skipExpire = !!config.documents[key];
this.store.get(key, function(ret) {
if (ret) {
winston.verbose('retrieved document', { key: key });
response.writeHead(200, { 'content-type': 'application/json' });
response.end(JSON.stringify({ data: ret, key: key }));
if (request.method === 'HEAD') {
response.end();
} else {
response.end(JSON.stringify({ data: ret, key: key }));
}
}
else {
winston.warn('document not found', { key: key });
response.writeHead(404, { 'content-type': 'application/json' });
response.end(JSON.stringify({ message: 'Document not found.' }));
if (request.method === 'HEAD') {
response.end();
} else {
response.end(JSON.stringify({ message: 'Document not found.' }));
}
}
}, skipExpire);
};
// Handle retrieving the raw version of a document
DocumentHandler.prototype.handleRawGet = function(key, response, skipExpire) {
DocumentHandler.prototype.handleRawGet = function(request, response, config) {
const key = request.params.id.split('.')[0];
const skipExpire = !!config.documents[key];
this.store.get(key, function(ret) {
if (ret) {
winston.verbose('retrieved raw document', { key: key });
response.writeHead(200, { 'content-type': 'text/plain' });
response.end(ret);
response.writeHead(200, { 'content-type': 'text/plain; charset=UTF-8' });
if (request.method === 'HEAD') {
response.end();
} else {
response.end(ret);
}
}
else {
winston.warn('raw document not found', { key: key });
response.writeHead(404, { 'content-type': 'application/json' });
response.end(JSON.stringify({ message: 'Document not found.' }));
if (request.method === 'HEAD') {
response.end();
} else {
response.end(JSON.stringify({ message: 'Document not found.' }));
}
}
}, skipExpire);
};
// Handle adding a new Document
DocumentHandler.prototype.handlePost = function(request, response) {
DocumentHandler.prototype.handlePost = function (request, response) {
var _this = this;
var buffer = '';
var cancelled = false;
request.on('data', function(data) {
if (!buffer) {
response.writeHead(200, { 'content-type': 'application/json' });
}
buffer += data.toString();
// What to do when done
var onSuccess = function () {
// Check length
if (_this.maxLength && buffer.length > _this.maxLength) {
cancelled = true;
winston.warn('document >maxLength', { maxLength: _this.maxLength });
@@ -63,14 +85,14 @@ DocumentHandler.prototype.handlePost = function(request, response) {
response.end(
JSON.stringify({ message: 'Document exceeds maximum length.' })
);
return;
}
});
request.on('end', function(end) {
if (cancelled) return;
_this.chooseKey(function(key) {
_this.store.set(key, buffer, function(res) {
// And then save if we should
_this.chooseKey(function (key) {
_this.store.set(key, buffer, function (res) {
if (res) {
winston.verbose('added document', { key: key });
response.writeHead(200, { 'content-type': 'application/json' });
response.end(JSON.stringify({ key: key }));
}
else {
@@ -80,12 +102,37 @@ DocumentHandler.prototype.handlePost = function(request, response) {
}
});
});
});
request.on('error', function(error) {
winston.error('connection error: ' + error.message);
response.writeHead(500, { 'content-type': 'application/json' });
response.end(JSON.stringify({ message: 'Connection error.' }));
});
};
// If we should, parse a form to grab the data
var ct = request.headers['content-type'];
if (ct && ct.split(';')[0] === 'multipart/form-data') {
var busboy = new Busboy({ headers: request.headers });
busboy.on('field', function (fieldname, val) {
if (fieldname === 'data') {
buffer = val;
}
});
busboy.on('finish', function () {
onSuccess();
});
request.pipe(busboy);
// Otherwise, use our own and just grab flat data from POST body
} else {
request.on('data', function (data) {
buffer += data.toString();
});
request.on('end', function () {
if (cancelled) { return; }
onSuccess();
});
request.on('error', function (error) {
winston.error('connection error: ' + error.message);
response.writeHead(500, { 'content-type': 'application/json' });
response.end(JSON.stringify({ message: 'Connection error.' }));
cancelled = true;
});
}
};
// Keep choosing keys until one isn't taken
@@ -98,7 +145,7 @@ DocumentHandler.prototype.chooseKey = function(callback) {
} else {
callback(key);
}
});
}, true); // Don't bump expirations when key searching
};
DocumentHandler.prototype.acceptableKey = function() {
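
The handler changes above add HEAD handling alongside GET on both the document and raw endpoints. Below is a minimal client-side sketch of checking whether a document exists without downloading it, assuming a server on localhost:7777; the `/documents/` route prefix and the key `somekey` are assumptions:

```js
// Sketch only: probe for a document with HEAD (headers only, no body).
// Assumes haste-server listens on localhost:7777; the route prefix and the
// key 'somekey' are assumptions.
const http = require('http');

const req = http.request(
  { host: 'localhost', port: 7777, path: '/documents/somekey', method: 'HEAD' },
  (res) => {
    // 200 when the key exists, 404 when it does not.
    console.log('status:', res.statusCode);
  }
);
req.on('error', (err) => console.error('request failed:', err.message));
req.end();
```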


@@ -0,0 +1,56 @@
/*global require,module,process*/
var AWS = require('aws-sdk');
var winston = require('winston');
var AmazonS3DocumentStore = function(options) {
this.expire = options.expire;
this.bucket = options.bucket;
this.client = new AWS.S3({region: options.region});
};
AmazonS3DocumentStore.prototype.get = function(key, callback, skipExpire) {
var _this = this;
var req = {
Bucket: _this.bucket,
Key: key
};
_this.client.getObject(req, function(err, data) {
if(err) {
callback(false);
}
else {
callback(data.Body.toString('utf-8'));
if (_this.expire && !skipExpire) {
winston.warn('amazon s3 store cannot set expirations on keys');
}
}
});
}
AmazonS3DocumentStore.prototype.set = function(key, data, callback, skipExpire) {
var _this = this;
var req = {
Bucket: _this.bucket,
Key: key,
Body: data,
ContentType: 'text/plain'
};
_this.client.putObject(req, function(err, data) {
if (err) {
callback(false);
}
else {
callback(true);
if (_this.expire && !skipExpire) {
winston.warn('amazon s3 store cannot set expirations on keys');
}
}
});
}
module.exports = AmazonS3DocumentStore;
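
A minimal usage sketch for the Amazon S3 store above, assuming it is saved under the project's document-store directory (the require path below is a guess) and that AWS credentials are already available to the SDK:

```js
// Sketch only: construct the S3 store and round-trip one value.
// The require path, bucket, and region are assumptions.
const AmazonS3DocumentStore = require('./lib/document_stores/amazon-s3');

const store = new AmazonS3DocumentStore({
  bucket: 'your-bucket-name',
  region: 'us-east-1'
});

store.set('example-key', 'hello world', (ok) => {
  if (!ok) return console.error('save failed');
  store.get('example-key', (value) => {
    console.log('stored value:', value); // false when the lookup fails
  });
});
```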


@@ -0,0 +1,89 @@
/*global require,module,process*/
const Datastore = require('@google-cloud/datastore');
const winston = require('winston');
class GoogleDatastoreDocumentStore {
// Create a new store with options
constructor(options) {
this.kind = "Haste";
this.expire = options.expire;
this.datastore = new Datastore();
}
// Save file in a key
set(key, data, callback, skipExpire) {
var expireTime = (skipExpire || this.expire === undefined) ? null : new Date(Date.now() + this.expire * 1000);
var taskKey = this.datastore.key([this.kind, key])
var task = {
key: taskKey,
data: [
{
name: 'value',
value: data,
excludeFromIndexes: true
},
{
name: 'expiration',
value: expireTime
}
]
};
this.datastore.insert(task).then(() => {
callback(true);
})
.catch(err => {
callback(false);
});
}
// Get a file from a key
get(key, callback, skipExpire) {
var taskKey = this.datastore.key([this.kind, key])
this.datastore.get(taskKey).then((entity) => {
if (skipExpire || entity[0]["expiration"] == null) {
callback(entity[0]["value"]);
}
else {
// check for expiry
if (entity[0]["expiration"] < new Date()) {
winston.info("document expired", {key: key, expiration: entity[0]["expiration"], check: new Date(null)});
callback(false);
}
else {
// update expiry
var task = {
key: taskKey,
data: [
{
name: 'value',
value: entity[0]["value"],
excludeFromIndexes: true
},
{
name: 'expiration',
value: new Date(Date.now() + this.expire * 1000)
}
]
};
this.datastore.update(task).then(() => {
})
.catch(err => {
winston.error("failed to update expiration", {error: err});
});
callback(entity[0]["value"]);
}
}
})
.catch(err => {
winston.error("Error retrieving value from Google Datastore", {error: err});
callback(false);
});
}
}
module.exports = GoogleDatastoreDocumentStore;


@@ -1,45 +1,54 @@
var memcached = require('memcache');
var winston = require('winston');
const memcached = require('memcached');
const winston = require('winston');
// Create a new store with options
var MemcachedDocumentStore = function(options) {
this.expire = options.expire;
if (!MemcachedDocumentStore.client) {
MemcachedDocumentStore.connect(options);
class MemcachedDocumentStore {
// Create a new store with options
constructor(options) {
this.expire = options.expire;
const host = options.host || '127.0.0.1';
const port = options.port || 11211;
const url = `${host}:${port}`;
this.connect(url);
}
};
// Create a connection
MemcachedDocumentStore.connect = function(options) {
var host = options.host || '127.0.0.1';
var port = options.port || 11211;
this.client = new memcached.Client(port, host);
this.client.connect();
this.client.on('connect', function() {
winston.info('connected to memcached on ' + host + ':' + port);
});
this.client.on('error', function(e) {
winston.info('error connecting to memcached', { error: e });
});
};
// Create a connection
connect(url) {
this.client = new memcached(url);
// Save file in a key
MemcachedDocumentStore.prototype.set =
function(key, data, callback, skipExpire) {
MemcachedDocumentStore.client.set(key, data, function(err, reply) {
err ? callback(false) : callback(true);
}, skipExpire ? 0 : this.expire);
};
winston.info(`connecting to memcached on ${url}`);
// Get a file from a key
MemcachedDocumentStore.prototype.get = function(key, callback, skipExpire) {
var _this = this;
MemcachedDocumentStore.client.get(key, function(err, reply) {
callback(err ? false : reply);
if (_this.expire && !skipExpire) {
winston.warn('store does not currently push forward expirations on GET');
}
});
};
this.client.on('failure', function(error) {
winston.info('error connecting to memcached', {error});
});
}
// Save file in a key
set(key, data, callback, skipExpire) {
this.client.set(key, data, skipExpire ? 0 : this.expire || 0, (error) => {
callback(!error);
});
}
// Get a file from a key
get(key, callback, skipExpire) {
this.client.get(key, (error, data) => {
const value = error ? false : data;
callback(value);
// Update the key so that the expiration is pushed forward
if (value && !skipExpire) {
this.set(key, data, (updateSucceeded) => {
if (!updateSucceeded) {
winston.error('failed to update expiration on GET', {key});
}
}, skipExpire);
}
});
}
}
module.exports = MemcachedDocumentStore;
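
A minimal usage sketch for the rewritten memcached store above, assuming a memcached daemon on the default port; the require path is a guess:

```js
// Sketch only: write then read one key through the rewritten store.
// Assumes memcached is reachable on 127.0.0.1:11211; the require path is a guess.
const MemcachedDocumentStore = require('./lib/document_stores/memcached');

const store = new MemcachedDocumentStore({
  host: '127.0.0.1',
  port: 11211,
  expire: 2592000 // seconds; omit to store keys without an expiration
});

store.set('example-key', 'hello world', (ok) => {
  if (!ok) return console.error('save failed');
  // Per the change above, a successful read also pushes the expiration forward.
  store.get('example-key', (value) => console.log('stored value:', value));
});
```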


@@ -0,0 +1,88 @@
var MongoClient = require('mongodb').MongoClient,
winston = require('winston');
var MongoDocumentStore = function (options) {
this.expire = options.expire;
this.connectionUrl = process.env.DATABASE_URl || options.connectionUrl;
};
MongoDocumentStore.prototype.set = function (key, data, callback, skipExpire) {
var now = Math.floor(new Date().getTime() / 1000),
that = this;
this.safeConnect(function (err, db) {
if (err)
return callback(false);
db.collection('entries').update({
'entry_id': key,
$or: [
{ expiration: -1 },
{ expiration: { $gt: now } }
]
}, {
'entry_id': key,
'value': data,
'expiration': that.expire && !skipExpire ? that.expire + now : -1
}, {
upsert: true
}, function (err, existing) {
if (err) {
winston.error('error persisting value to mongodb', { error: err });
return callback(false);
}
callback(true);
});
});
};
MongoDocumentStore.prototype.get = function (key, callback, skipExpire) {
var now = Math.floor(new Date().getTime() / 1000),
that = this;
this.safeConnect(function (err, db) {
if (err)
return callback(false);
db.collection('entries').findOne({
'entry_id': key,
$or: [
{ expiration: -1 },
{ expiration: { $gt: now } }
]
}, function (err, entry) {
if (err) {
winston.error('error persisting value to mongodb', { error: err });
return callback(false);
}
callback(entry === null ? false : entry.value);
if (entry !== null && entry.expiration !== -1 && that.expire && !skipExpire) {
db.collection('entries').update({
'entry_id': key
}, {
$set: {
'expiration': that.expire + now
}
}, function (err, result) { });
}
});
});
};
MongoDocumentStore.prototype.safeConnect = function (callback) {
MongoClient.connect(this.connectionUrl, function (err, db) {
if (err) {
winston.error('error connecting to mongodb', { error: err });
callback(err);
} else {
callback(undefined, db);
}
});
};
module.exports = MongoDocumentStore;


@@ -0,0 +1,80 @@
/*global require,module,process*/
var winston = require('winston');
const {Pool} = require('pg');
// create table entries (id serial primary key, key varchar(255) not null, value text not null, expiration int, unique(key));
// A postgres document store
var PostgresDocumentStore = function (options) {
this.expireJS = options.expire;
const connectionString = process.env.DATABASE_URL || options.connectionUrl;
this.pool = new Pool({connectionString});
};
PostgresDocumentStore.prototype = {
// Set a given key
set: function (key, data, callback, skipExpire) {
var now = Math.floor(new Date().getTime() / 1000);
var that = this;
this.safeConnect(function (err, client, done) {
if (err) { return callback(false); }
client.query('INSERT INTO entries (key, value, expiration) VALUES ($1, $2, $3)', [
key,
data,
that.expireJS && !skipExpire ? that.expireJS + now : null
], function (err) {
if (err) {
winston.error('error persisting value to postgres', { error: err });
return callback(false);
}
callback(true);
done();
});
});
},
// Get a given key's data
get: function (key, callback, skipExpire) {
var now = Math.floor(new Date().getTime() / 1000);
var that = this;
this.safeConnect(function (err, client, done) {
if (err) { return callback(false); }
client.query('SELECT id,value,expiration from entries where KEY = $1 and (expiration IS NULL or expiration > $2)', [key, now], function (err, result) {
if (err) {
winston.error('error retrieving value from postgres', { error: err });
return callback(false);
}
callback(result.rows.length ? result.rows[0].value : false);
if (result.rows.length && that.expireJS && !skipExpire) {
client.query('UPDATE entries SET expiration = $1 WHERE ID = $2', [
that.expireJS + now,
result.rows[0].id
], function (err) {
if (!err) {
done();
}
});
} else {
done();
}
});
});
},
// A connection wrapper
safeConnect: function (callback) {
this.pool.connect((error, client, done) => {
if (error) {
winston.error('error connecting to postgres', {error});
callback(error);
} else {
callback(undefined, client, done);
}
});
}
};
module.exports = PostgresDocumentStore;
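
A usage sketch for the postgres store above. It assumes the `entries` table from the README has already been created; the require path and connection URL are assumptions:

```js
// Sketch only: round-trip one key through the postgres store.
// Requires the entries table described in the README; the require path and
// connection URL are assumptions.
const PostgresDocumentStore = require('./lib/document_stores/postgres');

const store = new PostgresDocumentStore({
  connectionUrl: 'postgres://user:password@localhost:5432/database',
  expire: 2592000 // seconds; expirations are pushed forward on reads
});

store.set('example-key', 'hello world', (ok) => {
  if (!ok) return console.error('save failed');
  store.get('example-key', (value) => console.log('stored value:', value));
});
```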


@@ -8,9 +8,13 @@ var winston = require('winston');
// options[db] - The db to use (default 0)
// options[expire] - The time to live for each key set (default never)
-var RedisDocumentStore = function(options) {
+var RedisDocumentStore = function(options, client) {
this.expire = options.expire;
-if (!RedisDocumentStore.client) {
+if (client) {
+winston.info('using predefined redis client');
+RedisDocumentStore.client = client;
+} else if (!RedisDocumentStore.client) {
winston.info('configuring redis');
RedisDocumentStore.connect(options);
}
};
@@ -21,11 +25,20 @@ RedisDocumentStore.connect = function(options) {
var port = options.port || 6379;
var index = options.db || 0;
RedisDocumentStore.client = redis.createClient(port, host);
-RedisDocumentStore.client.select(index, function(err, reply) {
+// authenticate if password is provided
+if (options.password) {
+RedisDocumentStore.client.auth(options.password);
+}
+RedisDocumentStore.client.on('error', function(err) {
+winston.error('redis disconnected', err);
+});
+RedisDocumentStore.client.select(index, function(err) {
if (err) {
winston.error(
'error connecting to redis index ' + index,
-{ error: err.message }
+{ error: err }
);
process.exit(1);
}
@@ -38,7 +51,7 @@ RedisDocumentStore.connect = function(options) {
// Save file in a key
RedisDocumentStore.prototype.set = function(key, data, callback, skipExpire) {
var _this = this;
-RedisDocumentStore.client.set(key, data, function(err, reply) {
+RedisDocumentStore.client.set(key, data, function(err) {
if (err) {
callback(false);
}
@@ -54,7 +67,7 @@ RedisDocumentStore.prototype.set = function(key, data, callback, skipExpire) {
// Expire a key in expire time if set
RedisDocumentStore.prototype.setExpiration = function(key) {
if (this.expire) {
-RedisDocumentStore.client.expire(key, this.expire, function(err, reply) {
+RedisDocumentStore.client.expire(key, this.expire, function(err) {
if (err) {
winston.error('failed to set expiry on key: ' + key);
}
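
The hunks above add an optional second constructor argument so a pre-built client can be injected (plus password auth and an error handler). A hedged sketch of both construction paths; the require path is assumed:

// Sketch -- require path assumed; the options mirror the ones read in connect()
const redis = require('redis');
const RedisDocumentStore = require('./lib/document_stores/redis');
// 1) let the store create and select its own client
const storeA = new RedisDocumentStore({ host: '127.0.0.1', port: 6379, db: 0, expire: 60 });
// 2) hand in an existing client via the new `client` parameter
const client = redis.createClient(6379, '127.0.0.1');
const storeB = new RedisDocumentStore({ expire: 60 }, client); // logs 'using predefined redis client'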


@@ -0,0 +1,46 @@
const crypto = require('crypto');
const rethink = require('rethinkdbdash');
const winston = require('winston');
const md5 = (str) => {
const md5sum = crypto.createHash('md5');
md5sum.update(str);
return md5sum.digest('hex');
};
class RethinkDBStore {
constructor(options) {
this.client = rethink({
silent: true,
host: options.host || '127.0.0.1',
port: options.port || 28015,
db: options.db || 'haste',
user: options.user || 'admin',
password: options.password || ''
});
}
set(key, data, callback) {
this.client.table('uploads').insert({ id: md5(key), data: data }).run((error) => {
if (error) {
callback(false);
winston.error('failed to insert to table', error);
return;
}
callback(true);
});
}
get(key, callback) {
this.client.table('uploads').get(md5(key)).run((error, result) => {
if (error || !result) {
callback(false);
if (error) winston.error('failed to get from table', error);
return;
}
callback(result.data);
});
}
}
module.exports = RethinkDBStore;
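
A short usage sketch; the require path is assumed, the options match the constructor above, and keys are md5-hashed before being used as RethinkDB ids:

// Sketch -- assumes a reachable RethinkDB with db 'haste' and table 'uploads'
const RethinkDBStore = require('./lib/document_stores/rethinkdb');
const store = new RethinkDBStore({ host: '127.0.0.1', port: 28015, db: 'haste' });
store.set('abc', 'some pasted text', (ok) => {
  // stored under md5('abc'), so get() hashes the key the same way
  store.get('abc', (value) => console.log(ok, value)); // true 'some pasted text'
});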


@@ -0,0 +1,32 @@
const fs = require('fs');
module.exports = class DictionaryGenerator {
constructor(options, readyCallback) {
// Check options format
if (!options) throw Error('No options passed to generator');
if (!options.path) throw Error('No dictionary path specified in options');
// Load dictionary
fs.readFile(options.path, 'utf8', (err, data) => {
if (err) throw err;
this.dictionary = data.split(/[\n\r]+/);
if (readyCallback) readyCallback();
});
}
// Generates a dictionary-based key, of keyLength words
createKey(keyLength) {
let text = '';
for (let i = 0; i < keyLength; i++) {
const index = Math.floor(Math.random() * this.dictionary.length);
text += this.dictionary[index];
}
return text;
}
};
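
A sketch of how this generator might be used; the require path and the word-list location are illustrative assumptions:

// Hypothetical: /usr/share/dict/words as options.path
const DictionaryGenerator = require('./lib/key_generators/dictionary');
const gen = new DictionaryGenerator({ path: '/usr/share/dict/words' }, () => {
  // the ready callback fires once the file has been read and split into words
  console.log(gen.createKey(3)); // e.g. 'correcthorsebattery' -- three random words joined
});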


@@ -1,32 +1,27 @@
// Draws inspiration from pwgen and http://tools.arantius.com/password
-var PhoneticKeyGenerator = function(options) {
-// No options
-};
-// Generate a phonetic key
-PhoneticKeyGenerator.prototype.createKey = function(keyLength) {
-var text = '';
-for (var i = 0; i < keyLength; i++) {
-text += (i % 2 == 0) ? this.randConsonant() : this.randVowel();
-}
-return text;
-};
-PhoneticKeyGenerator.consonants = 'bcdfghjklmnpqrstvwxy';
-PhoneticKeyGenerator.vowels = 'aeiou';
-// Get an random vowel
-PhoneticKeyGenerator.prototype.randVowel = function() {
-return PhoneticKeyGenerator.vowels[
-Math.floor(Math.random() * PhoneticKeyGenerator.vowels.length)
-];
-};
-// Get an random consonant
-PhoneticKeyGenerator.prototype.randConsonant = function() {
-return PhoneticKeyGenerator.consonants[
-Math.floor(Math.random() * PhoneticKeyGenerator.consonants.length)
-];
-};
-module.exports = PhoneticKeyGenerator;
+const randOf = (collection) => {
+return () => {
+return collection[Math.floor(Math.random() * collection.length)];
+};
+};
+// Helper methods to get a random vowel or consonant
+const randVowel = randOf('aeiou');
+const randConsonant = randOf('bcdfghjklmnpqrstvwxyz');
+module.exports = class PhoneticKeyGenerator {
+// Generate a phonetic key of alternating consonant & vowel
+createKey(keyLength) {
+let text = '';
+const start = Math.round(Math.random());
+for (let i = 0; i < keyLength; i++) {
+text += (i % 2 == start) ? randConsonant() : randVowel();
+}
+return text;
+}
+};
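
After this rewrite the generator alternates consonants and vowels from a random starting parity. A small sketch (require path assumed):

const PhoneticKeyGenerator = require('./lib/key_generators/phonetic');
const gen = new PhoneticKeyGenerator();
console.log(gen.createKey(10)); // e.g. 'tanodirela' or 'anodirelat', depending on the random start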


@@ -1,19 +1,20 @@
-var RandomKeyGenerator = function(options) {
-if (!options) {
-options = {};
-}
-this.keyspace = options.keyspace || 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
-};
-// Generate a random key
-RandomKeyGenerator.prototype.createKey = function(keyLength) {
-var text = '';
-var index;
-for (var i = 0; i < keyLength; i++) {
-index = Math.floor(Math.random() * this.keyspace.length);
-text += this.keyspace.charAt(index);
-}
-return text;
-};
-module.exports = RandomKeyGenerator;
+module.exports = class RandomKeyGenerator {
+// Initialize a new generator with the given keySpace
+constructor(options = {}) {
+this.keyspace = options.keyspace || 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
+}
+// Generate a key of the given length
+createKey(keyLength) {
+var text = '';
+for (var i = 0; i < keyLength; i++) {
+const index = Math.floor(Math.random() * this.keyspace.length);
+text += this.keyspace.charAt(index);
+}
+return text;
+}
+};
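
Usage is unchanged by the class rewrite; a quick sketch (require path assumed):

const RandomKeyGenerator = require('./lib/key_generators/random');
const gen = new RandomKeyGenerator({ keyspace: 'abcdef0123456789' }); // hex-style keys
console.log(gen.createKey(8)); // e.g. '3fa91c0b'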

node_modules/.bin/_mocha generated vendored

@@ -1 +0,0 @@
../mocha/bin/_mocha

node_modules/.bin/mocha generated vendored

@@ -1 +0,0 @@
../mocha/bin/mocha

node_modules/.bin/uglifyjs generated vendored

@@ -1 +0,0 @@
../uglify-js/bin/uglifyjs

node_modules/connect/.npmignore generated vendored

@@ -1,11 +0,0 @@
*.markdown
*.md
.git*
Makefile
benchmarks/
docs/
examples/
install.sh
support/
test/
.DS_Store

node_modules/connect/LICENSE generated vendored

@@ -1,24 +0,0 @@
(The MIT License)
Copyright (c) 2010 Sencha Inc.
Copyright (c) 2011 LearnBoost
Copyright (c) 2011 TJ Holowaychuk
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

node_modules/connect/index.js generated vendored

@@ -1,2 +0,0 @@
module.exports = require('./lib/connect');

node_modules/connect/lib/cache.js generated vendored

@@ -1,81 +0,0 @@
/*!
* Connect - Cache
* Copyright(c) 2011 Sencha Inc.
* MIT Licensed
*/
/**
* Expose `Cache`.
*/
module.exports = Cache;
/**
* LRU cache store.
*
* @param {Number} limit
* @api private
*/
function Cache(limit) {
this.store = {};
this.keys = [];
this.limit = limit;
}
/**
* Touch `key`, promoting the object.
*
* @param {String} key
* @param {Number} i
* @api private
*/
Cache.prototype.touch = function(key, i){
this.keys.splice(i,1);
this.keys.push(key);
};
/**
* Remove `key`.
*
* @param {String} key
* @api private
*/
Cache.prototype.remove = function(key){
delete this.store[key];
};
/**
* Get the object stored for `key`.
*
* @param {String} key
* @return {Array}
* @api private
*/
Cache.prototype.get = function(key){
return this.store[key];
};
/**
* Add a cache `key`.
*
* @param {String} key
* @return {Array}
* @api private
*/
Cache.prototype.add = function(key){
// initialize store
var len = this.keys.push(key);
// limit reached, evict the least recently used key
if (len > this.limit) this.remove(this.keys.shift());
var arr = this.store[key] = [];
arr.createdAt = new Date;
return arr;
};
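
For reference, a hypothetical sketch of how this vendored LRU cache behaves (not part of the deleted file; the require path is illustrative):

const Cache = require('connect/lib/cache');
const cache = new Cache(2);          // keep at most 2 keys
cache.add('a').push('payload-a');    // add() returns the backing array for the key
cache.add('b').push('payload-b');
cache.touch('a', 0);                 // promote 'a' ('a' sits at index 0 of cache.keys)
cache.add('c');                      // over the limit: the oldest key ('b') is evicted
console.log(cache.get('b'));         // undefined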

node_modules/connect/lib/connect.js generated vendored

@@ -1,106 +0,0 @@
/*!
* Connect
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var HTTPServer = require('./http').Server
, HTTPSServer = require('./https').Server
, fs = require('fs');
// node patches
require('./patch');
// expose createServer() as the module
exports = module.exports = createServer;
/**
* Framework version.
*/
exports.version = '1.8.5';
/**
* Initialize a new `connect.HTTPServer` with the middleware
* passed to this function. When an object is passed _first_,
* we assume these are the tls options, and return a `connect.HTTPSServer`.
*
* Examples:
*
* An example HTTP server, accepting several middleware.
*
* var server = connect.createServer(
* connect.logger()
* , connect.static(__dirname + '/public')
* );
*
* An HTTPS server, utilizing the same middleware as above.
*
* var server = connect.createServer(
* { key: key, cert: cert }
* , connect.logger()
* , connect.static(__dirname + '/public')
* );
*
* Alternatively with connect 1.0 we may omit `createServer()`.
*
* connect(
* connect.logger()
* , connect.static(__dirname + '/public')
* ).listen(3000);
*
* @param {Object|Function} ...
* @return {Server}
* @api public
*/
function createServer() {
if ('object' == typeof arguments[0]) {
return new HTTPSServer(arguments[0], Array.prototype.slice.call(arguments, 1));
} else {
return new HTTPServer(Array.prototype.slice.call(arguments));
}
};
// support connect.createServer()
exports.createServer = createServer;
// auto-load getters
exports.middleware = {};
/**
* Auto-load bundled middleware with getters.
*/
fs.readdirSync(__dirname + '/middleware').forEach(function(filename){
if (/\.js$/.test(filename)) {
var name = filename.substr(0, filename.lastIndexOf('.'));
exports.middleware.__defineGetter__(name, function(){
return require('./middleware/' + name);
});
}
});
// expose utils
exports.utils = require('./utils');
// expose getters as first-class exports
exports.utils.merge(exports, exports.middleware);
// expose constructors
exports.HTTPServer = HTTPServer;
exports.HTTPSServer = HTTPSServer;

node_modules/connect/lib/http.js generated vendored

@@ -1,217 +0,0 @@
/*!
* Connect - HTTPServer
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var http = require('http')
, parse = require('url').parse
, assert = require('assert');
// environment
var env = process.env.NODE_ENV || 'development';
/**
* Initialize a new `Server` with the given `middleware`.
*
* Examples:
*
* var server = connect.createServer(
* connect.favicon()
* , connect.logger()
* , connect.static(__dirname + '/public')
* );
*
* @param {Array} middleware
* @return {Server}
* @api public
*/
var Server = exports.Server = function HTTPServer(middleware) {
this.stack = [];
middleware.forEach(function(fn){
this.use(fn);
}, this);
http.Server.call(this, this.handle);
};
/**
* Inherit from `http.Server.prototype`.
*/
Server.prototype.__proto__ = http.Server.prototype;
/**
* Utilize the given middleware `handle` to the given `route`,
* defaulting to _/_. This "route" is the mount-point for the
* middleware, when given a value other than _/_ the middleware
* is only effective when that segment is present in the request's
* pathname.
*
* For example if we were to mount a function at _/admin_, it would
* be invoked on _/admin_, and _/admin/settings_, however it would
* not be invoked for _/_, or _/posts_.
*
* This is effectively the same as passing middleware to `connect.createServer()`,
* however provides a progressive api.
*
* Examples:
*
* var server = connect.createServer();
* server.use(connect.favicon());
* server.use(connect.logger());
* server.use(connect.static(__dirname + '/public'));
*
* If we wanted to prefix static files with _/public_, we could
* "mount" the `static()` middleware:
*
* server.use('/public', connect.static(__dirname + '/public'));
*
* This api is chainable, meaning the following is valid:
*
* connect.createServer()
* .use(connect.favicon())
* .use(connect.logger())
* .use(connect.static(__dirname + '/public'))
* .listen(3000);
*
* @param {String|Function} route or handle
* @param {Function} handle
* @return {Server}
* @api public
*/
Server.prototype.use = function(route, handle){
this.route = '/';
// default route to '/'
if ('string' != typeof route) {
handle = route;
route = '/';
}
// wrap sub-apps
if ('function' == typeof handle.handle) {
var server = handle;
server.route = route;
handle = function(req, res, next) {
server.handle(req, res, next);
};
}
// wrap vanilla http.Servers
if (handle instanceof http.Server) {
handle = handle.listeners('request')[0];
}
// normalize route to not trail with slash
if ('/' == route[route.length - 1]) {
route = route.substr(0, route.length - 1);
}
// add the middleware
this.stack.push({ route: route, handle: handle });
// allow chaining
return this;
};
/**
* Handle server requests, punting them down
* the middleware stack.
*
* @api private
*/
Server.prototype.handle = function(req, res, out) {
var writeHead = res.writeHead
, stack = this.stack
, removed = ''
, index = 0;
function next(err) {
var layer, path, c;
req.url = removed + req.url;
req.originalUrl = req.originalUrl || req.url;
removed = '';
layer = stack[index++];
// all done
if (!layer || res.headerSent) {
// but wait! we have a parent
if (out) return out(err);
// error
if (err) {
var msg = 'production' == env
? 'Internal Server Error'
: err.stack || err.toString();
// output to stderr in a non-test env
if ('test' != env) console.error(err.stack || err.toString());
// unable to respond
if (res.headerSent) return req.socket.destroy();
res.statusCode = 500;
res.setHeader('Content-Type', 'text/plain');
if ('HEAD' == req.method) return res.end();
res.end(msg);
} else {
res.statusCode = 404;
res.setHeader('Content-Type', 'text/plain');
if ('HEAD' == req.method) return res.end();
res.end('Cannot ' + req.method + ' ' + req.url);
}
return;
}
try {
path = parse(req.url).pathname;
if (undefined == path) path = '/';
// skip this layer if the route doesn't match.
if (0 != path.indexOf(layer.route)) return next(err);
c = path[layer.route.length];
if (c && '/' != c && '.' != c) return next(err);
// Call the layer handler
// Trim off the part of the url that matches the route
removed = layer.route;
req.url = req.url.substr(removed.length);
// Ensure leading slash
if ('/' != req.url[0]) req.url = '/' + req.url;
var arity = layer.handle.length;
if (err) {
if (arity === 4) {
layer.handle(err, req, res, next);
} else {
next(err);
}
} else if (arity < 4) {
layer.handle(req, res, next);
} else {
next();
}
} catch (e) {
if (e instanceof assert.AssertionError) {
console.error(e.stack + '\n');
next(e);
} else {
next(e);
}
}
}
next();
};

node_modules/connect/lib/https.js generated vendored

@@ -1,47 +0,0 @@
/*!
* Connect - HTTPServer
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var HTTPServer = require('./http').Server
, https = require('https');
/**
* Initialize a new `Server` with the given
*`options` and `middleware`. The HTTPS api
* is identical to the [HTTP](http.html) server,
* however TLS `options` must be provided before
* passing in the optional middleware.
*
* @param {Object} options
* @param {Array} middleware
* @return {Server}
* @api public
*/
var Server = exports.Server = function HTTPSServer(options, middleware) {
this.stack = [];
middleware.forEach(function(fn){
this.use(fn);
}, this);
https.Server.call(this, options, this.handle);
};
/**
* Inherit from `http.Server.prototype`.
*/
Server.prototype.__proto__ = https.Server.prototype;
// mixin HTTPServer methods
Object.keys(HTTPServer.prototype).forEach(function(method){
Server.prototype[method] = HTTPServer.prototype[method];
});

node_modules/connect/lib/index.js generated vendored

@@ -1,46 +0,0 @@
/**
* # Connect
*
* Connect is a middleware framework for node,
* shipping with over 11 bundled middleware and a rich choice of
* [3rd-party middleware](https://github.com/senchalabs/connect/wiki).
*
* Installation:
*
* $ npm install connect
*
* API:
*
* - [connect](connect.html) general
* - [http](http.html) http server
* - [https](https.html) https server
*
* Middleware:
*
* - [logger](middleware-logger.html) request logger with custom format support
* - [csrf](middleware-csrf.html) Cross-site request forgery protection
* - [basicAuth](middleware-basicAuth.html) basic http authentication
* - [bodyParser](middleware-bodyParser.html) extensible request body parser
* - [cookieParser](middleware-cookieParser.html) cookie parser
* - [session](middleware-session.html) session management support with bundled [MemoryStore](middleware-session-memory.html)
* - [compiler](middleware-compiler.html) static asset compiler (sass, less, coffee-script, etc)
* - [methodOverride](middleware-methodOverride.html) faux HTTP method support
* - [responseTime](middleware-responseTime.html) calculates response-time and exposes via X-Response-Time
* - [router](middleware-router.html) provides rich Sinatra / Express-like routing
* - [staticCache](middleware-staticCache.html) memory cache layer for the static() middleware
* - [static](middleware-static.html) streaming static file server supporting `Range` and more
* - [directory](middleware-directory.html) directory listing middleware
* - [vhost](middleware-vhost.html) virtual host sub-domain mapping middleware
* - [favicon](middleware-favicon.html) efficient favicon server (with default icon)
* - [limit](middleware-limit.html) limit the bytesize of request bodies
* - [profiler](middleware-profiler.html) request profiler reporting response-time, memory usage, etc
* - [query](middleware-query.html) automatic querystring parser, populating `req.query`
* - [errorHandler](middleware-errorHandler.html) flexible error handler
*
* Internals:
*
* - connect [utilities](utils.html)
* - node monkey [patches](patch.html)
*
*/


@@ -1,93 +0,0 @@
/*!
* Connect - basicAuth
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('../utils')
, unauthorized = utils.unauthorized
, badRequest = utils.badRequest;
/**
* Enforce basic authentication by providing a `callback(user, pass)`,
* which must return `true` in order to gain access. Alternatively an async
* method is provided as well, invoking `callback(user, pass, callback)`. Populates
* `req.remoteUser`. The final alternative is simply passing username / password
* strings.
*
* Examples:
*
* connect(connect.basicAuth('username', 'password'));
*
* connect(
* connect.basicAuth(function(user, pass){
* return 'tj' == user && 'wahoo' == pass;
* })
* );
*
* connect(
* connect.basicAuth(function(user, pass, fn){
* User.authenticate({ user: user, pass: pass }, fn);
* })
* );
*
* @param {Function|String} callback or username
* @param {String} realm
* @api public
*/
module.exports = function basicAuth(callback, realm) {
var username, password;
// user / pass strings
if ('string' == typeof callback) {
username = callback;
password = realm;
if ('string' != typeof password) throw new Error('password argument required');
realm = arguments[2];
callback = function(user, pass){
return user == username && pass == password;
}
}
realm = realm || 'Authorization Required';
return function(req, res, next) {
var authorization = req.headers.authorization;
if (req.remoteUser) return next();
if (!authorization) return unauthorized(res, realm);
var parts = authorization.split(' ')
, scheme = parts[0]
, credentials = new Buffer(parts[1], 'base64').toString().split(':');
if ('Basic' != scheme) return badRequest(res);
// async
if (callback.length >= 3) {
var pause = utils.pause(req);
callback(credentials[0], credentials[1], function(err, user){
if (err || !user) return unauthorized(res, realm);
req.remoteUser = user;
next();
pause.resume();
});
// sync
} else {
if (callback(credentials[0], credentials[1])) {
req.remoteUser = credentials[0];
next();
} else {
unauthorized(res, realm);
}
}
}
};


@@ -1,196 +0,0 @@
/*!
* Connect - bodyParser
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var qs = require('qs')
, formidable = require('formidable');
/**
* Extract the mime type from the given request's
* _Content-Type_ header.
*
* @param {IncomingMessage} req
* @return {String}
* @api private
*/
function mime(req) {
var str = req.headers['content-type'] || '';
return str.split(';')[0];
}
/**
* Parse request bodies.
*
* By default _application/json_, _application/x-www-form-urlencoded_,
* and _multipart/form-data_ are supported, however you may map `connect.bodyParser.parse[contentType]`
* to a function receiving `(req, options, callback)`.
*
* Examples:
*
* connect.createServer(
* connect.bodyParser()
* , function(req, res) {
* res.end('viewing user ' + req.body.user.name);
* }
* );
*
* $ curl -d 'user[name]=tj' http://localhost/
* $ curl -d '{"user":{"name":"tj"}}' -H "Content-Type: application/json" http://localhost/
*
* Multipart req.files:
*
* As a security measure files are stored in a separate object, stored
* as `req.files`. This prevents attacks that may potentially alter
* filenames, and depending on the application gain access to restricted files.
*
* Multipart configuration:
*
* The `options` passed are provided to each parser function.
* The _multipart/form-data_ parser merges these with formidable's
* IncomingForm object, allowing you to tweak the upload directory,
* size limits, etc. For example you may wish to retain the file extension
* and change the upload directory:
*
* server.use(bodyParser({ uploadDir: '/www/mysite.com/uploads' }));
*
* View [node-formidable](https://github.com/felixge/node-formidable) for more information.
*
* If you wish to use formidable directly within your app, and do not
* desire this behaviour for multipart requests simply remove the
* parser:
*
* delete connect.bodyParser.parse['multipart/form-data'];
*
* Or
*
* delete express.bodyParser.parse['multipart/form-data'];
*
* @param {Object} options
* @return {Function}
* @api public
*/
exports = module.exports = function bodyParser(options){
options = options || {};
return function bodyParser(req, res, next) {
if (req.body) return next();
req.body = {};
if ('GET' == req.method || 'HEAD' == req.method) return next();
var parser = exports.parse[mime(req)];
if (parser) {
parser(req, options, next);
} else {
next();
}
}
};
/**
* Parsers.
*/
exports.parse = {};
/**
* Parse application/x-www-form-urlencoded.
*/
exports.parse['application/x-www-form-urlencoded'] = function(req, options, fn){
var buf = '';
req.setEncoding('utf8');
req.on('data', function(chunk){ buf += chunk });
req.on('end', function(){
try {
req.body = buf.length
? qs.parse(buf)
: {};
fn();
} catch (err){
fn(err);
}
});
};
/**
* Parse application/json.
*/
exports.parse['application/json'] = function(req, options, fn){
var buf = '';
req.setEncoding('utf8');
req.on('data', function(chunk){ buf += chunk });
req.on('end', function(){
try {
req.body = buf.length
? JSON.parse(buf)
: {};
fn();
} catch (err){
fn(err);
}
});
};
/**
* Parse multipart/form-data.
*
* TODO: make multiple support optional
* TODO: revisit "error" flag if it's a formidable bug
*/
exports.parse['multipart/form-data'] = function(req, options, fn){
var form = new formidable.IncomingForm
, data = {}
, files = {}
, done;
Object.keys(options).forEach(function(key){
form[key] = options[key];
});
function ondata(name, val, data){
if (Array.isArray(data[name])) {
data[name].push(val);
} else if (data[name]) {
data[name] = [data[name], val];
} else {
data[name] = val;
}
}
form.on('field', function(name, val){
ondata(name, val, data);
});
form.on('file', function(name, val){
ondata(name, val, files);
});
form.on('error', function(err){
fn(err);
done = true;
});
form.on('end', function(){
if (done) return;
try {
req.body = qs.parse(data);
req.files = qs.parse(files);
fn();
} catch (err) {
fn(err);
}
});
form.parse(req);
};


@@ -1,163 +0,0 @@
/*!
* Connect - compiler
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var fs = require('fs')
, path = require('path')
, parse = require('url').parse;
/**
* Require cache.
*/
var cache = {};
/**
* Setup compiler.
*
* Options:
*
* - `src` Source directory, defaults to **CWD**.
* - `dest` Destination directory, defaults `src`.
* - `enable` Array of enabled compilers.
*
* Compilers:
*
* - `sass` Compiles sass to css
* - `less` Compiles less to css
* - `coffeescript` Compiles coffee to js
*
* @param {Object} options
* @api public
*/
exports = module.exports = function compiler(options){
options = options || {};
var srcDir = options.src || process.cwd()
, destDir = options.dest || srcDir
, enable = options.enable;
if (!enable || enable.length === 0) {
throw new Error('compiler\'s "enable" option is not set, nothing will be compiled.');
}
return function compiler(req, res, next){
if ('GET' != req.method) return next();
var pathname = parse(req.url).pathname;
for (var i = 0, len = enable.length; i < len; ++i) {
var name = enable[i]
, compiler = compilers[name];
if (compiler.match.test(pathname)) {
var src = (srcDir + pathname).replace(compiler.match, compiler.ext)
, dest = destDir + pathname;
// Compare mtimes
fs.stat(src, function(err, srcStats){
if (err) {
if ('ENOENT' == err.code) {
next();
} else {
next(err);
}
} else {
fs.stat(dest, function(err, destStats){
if (err) {
// Oh snap! it does not exist, compile it
if ('ENOENT' == err.code) {
compile();
} else {
next(err);
}
} else {
// Source has changed, compile it
if (srcStats.mtime > destStats.mtime) {
compile();
} else {
// Defer file serving
next();
}
}
});
}
});
// Compile to the destination
function compile() {
fs.readFile(src, 'utf8', function(err, str){
if (err) {
next(err);
} else {
compiler.compile(str, function(err, str){
if (err) {
next(err);
} else {
fs.writeFile(dest, str, 'utf8', function(err){
next(err);
});
}
});
}
});
}
return;
}
}
next();
};
};
/**
* Bundled compilers:
*
* - [sass](http://github.com/visionmedia/sass.js) to _css_
* - [less](http://github.com/cloudhead/less.js) to _css_
* - [coffee](http://github.com/jashkenas/coffee-script) to _js_
*/
var compilers = exports.compilers = {
sass: {
match: /\.css$/,
ext: '.sass',
compile: function(str, fn){
var sass = cache.sass || (cache.sass = require('sass'));
try {
fn(null, sass.render(str));
} catch (err) {
fn(err);
}
}
},
less: {
match: /\.css$/,
ext: '.less',
compile: function(str, fn){
var less = cache.less || (cache.less = require('less'));
try {
less.render(str, fn);
} catch (err) {
fn(err);
}
}
},
coffeescript: {
match: /\.js$/,
ext: '.coffee',
compile: function(str, fn){
var coffee = cache.coffee || (cache.coffee = require('coffee-script'));
try {
fn(null, coffee.compile(str));
} catch (err) {
fn(err);
}
}
}
};


@@ -1,46 +0,0 @@
/*!
* Connect - cookieParser
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('./../utils');
/**
* Parse _Cookie_ header and populate `req.cookies`
* with an object keyed by the cookie names.
*
* Examples:
*
* connect.createServer(
* connect.cookieParser()
* , function(req, res, next){
* res.end(JSON.stringify(req.cookies));
* }
* );
*
* @return {Function}
* @api public
*/
module.exports = function cookieParser(){
return function cookieParser(req, res, next) {
var cookie = req.headers.cookie;
if (req.cookies) return next();
req.cookies = {};
if (cookie) {
try {
req.cookies = utils.parseCookie(cookie);
} catch (err) {
return next(err);
}
}
next();
};
};


@@ -1,105 +0,0 @@
/*!
* Connect - csrf
* Copyright(c) 2011 Sencha Inc.
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('../utils')
, crypto = require('crypto');
/**
* CSRF protection middleware.
*
* By default this middleware generates a token named "_csrf"
* which should be added to requests which mutate
* state, within a hidden form field, query-string etc. This
* token is validated against the visitor's `req.session._csrf`
* property which is re-generated per request.
*
* The default `value` function checks `req.body` generated
* by the `bodyParser()` middleware, `req.query` generated
* by `query()`, and the "X-CSRF-Token" header field.
*
* This middleware requires session support, thus should be added
* somewhere _below_ `session()` and `cookieParser()`.
*
* Examples:
*
* var form = '\n\
* <form action="/" method="post">\n\
* <input type="hidden" name="_csrf" value="{token}" />\n\
* <input type="text" name="user[name]" value="{user}" />\n\
* <input type="password" name="user[pass]" />\n\
* <input type="submit" value="Login" />\n\
* </form>\n\
* ';
*
* connect(
* connect.cookieParser()
* , connect.session({ secret: 'keyboard cat' })
* , connect.bodyParser()
* , connect.csrf()
*
* , function(req, res, next){
* if ('POST' != req.method) return next();
* req.session.user = req.body.user;
* next();
* }
*
* , function(req, res){
* res.setHeader('Content-Type', 'text/html');
* var body = form
* .replace('{token}', req.session._csrf)
* .replace('{user}', req.session.user && req.session.user.name || '');
* res.end(body);
* }
* ).listen(3000);
*
* Options:
*
* - `value` a function accepting the request, returning the token
*
* @param {Object} options
* @api public
*/
module.exports = function csrf(options) {
var options = options || {}
, value = options.value || defaultValue;
return function(req, res, next){
// generate CSRF token
var token = req.session._csrf || (req.session._csrf = utils.uid(24));
// ignore GET (for now)
if ('GET' == req.method) return next();
// determine value
var val = value(req);
// check
if (val != token) return utils.forbidden(res);
next();
}
};
/**
* Default value function, checking the `req.body`
* and `req.query` for the CSRF token.
*
* @param {IncomingMessage} req
* @return {String}
* @api private
*/
function defaultValue(req) {
return (req.body && req.body._csrf)
|| (req.query && req.query._csrf)
|| (req.headers['x-csrf-token']);
}


@@ -1,222 +0,0 @@
/*!
* Connect - directory
* Copyright(c) 2011 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
// TODO: icon / style for directories
// TODO: arrow key navigation
// TODO: make icons extensible
/**
* Module dependencies.
*/
var fs = require('fs')
, parse = require('url').parse
, utils = require('../utils')
, path = require('path')
, normalize = path.normalize
, extname = path.extname
, join = path.join;
/**
* Icon cache.
*/
var cache = {};
/**
* Serve directory listings with the given `root` path.
*
* Options:
*
* - `hidden` display hidden (dot) files. Defaults to false.
* - `icons` display icons. Defaults to false.
* - `filter` Apply this filter function to files. Defaults to false.
*
* @param {String} root
* @param {Object} options
* @return {Function}
* @api public
*/
exports = module.exports = function directory(root, options){
options = options || {};
// root required
if (!root) throw new Error('directory() root path required');
var hidden = options.hidden
, icons = options.icons
, filter = options.filter
, root = normalize(root);
return function directory(req, res, next) {
var accept = req.headers.accept || 'text/plain'
, url = parse(req.url)
, dir = decodeURIComponent(url.pathname)
, path = normalize(join(root, dir))
, originalUrl = parse(req.originalUrl)
, originalDir = decodeURIComponent(originalUrl.pathname)
, showUp = path != root && path != root + '/';
// null byte(s)
if (~path.indexOf('\0')) return utils.badRequest(res);
// malicious path
if (0 != path.indexOf(root)) return utils.forbidden(res);
// check if we have a directory
fs.stat(path, function(err, stat){
if (err) return 'ENOENT' == err.code
? next()
: next(err);
if (!stat.isDirectory()) return next();
// fetch files
fs.readdir(path, function(err, files){
if (err) return next(err);
if (!hidden) files = removeHidden(files);
if (filter) files = files.filter(filter);
files.sort();
// content-negotiation
for (var key in exports) {
if (~accept.indexOf(key) || ~accept.indexOf('*/*')) {
exports[key](req, res, files, next, originalDir, showUp, icons);
return;
}
}
utils.notAcceptable(res);
});
});
};
};
/**
* Respond with text/html.
*/
exports.html = function(req, res, files, next, dir, showUp, icons){
fs.readFile(__dirname + '/../public/directory.html', 'utf8', function(err, str){
if (err) return next(err);
fs.readFile(__dirname + '/../public/style.css', 'utf8', function(err, style){
if (err) return next(err);
if (showUp) files.unshift('..');
str = str
.replace('{style}', style)
.replace('{files}', html(files, dir, icons))
.replace('{directory}', dir)
.replace('{linked-path}', htmlPath(dir));
res.setHeader('Content-Type', 'text/html');
res.setHeader('Content-Length', str.length);
res.end(str);
});
});
};
/**
* Respond with application/json.
*/
exports.json = function(req, res, files){
files = JSON.stringify(files);
res.setHeader('Content-Type', 'application/json');
res.setHeader('Content-Length', files.length);
res.end(files);
};
/**
* Respond with text/plain.
*/
exports.plain = function(req, res, files){
files = files.join('\n') + '\n';
res.setHeader('Content-Type', 'text/plain');
res.setHeader('Content-Length', files.length);
res.end(files);
};
/**
* Map html `dir`, returning a linked path.
*/
function htmlPath(dir) {
var curr = [];
return dir.split('/').map(function(part){
curr.push(part);
return '<a href="' + curr.join('/') + '">' + part + '</a>';
}).join(' / ');
}
/**
* Map html `files`, returning an html unordered list.
*/
function html(files, dir, useIcons) {
return '<ul id="files">' + files.map(function(file){
var icon = ''
, classes = [];
if (useIcons && '..' != file) {
icon = icons[extname(file)] || icons.default;
icon = '<img src="data:image/png;base64,' + load(icon) + '" />';
classes.push('icon');
}
return '<li><a href="'
+ join(dir, file)
+ '" class="'
+ classes.join(' ') + '"'
+ ' title="' + file + '">'
+ icon + file + '</a></li>';
}).join('\n') + '</ul>';
}
/**
* Load and cache the given `icon`.
*
* @param {String} icon
* @return {String}
* @api private
*/
function load(icon) {
if (cache[icon]) return cache[icon];
return cache[icon] = fs.readFileSync(__dirname + '/../public/icons/' + icon, 'base64');
}
/**
* Filter "hidden" `files`, aka files
* beginning with a `.`.
*
* @param {Array} files
* @return {Array}
* @api private
*/
function removeHidden(files) {
return files.filter(function(file){
return '.' != file[0];
});
}
/**
* Icon map.
*/
var icons = {
'.js': 'page_white_code_red.png'
, '.c': 'page_white_c.png'
, '.h': 'page_white_h.png'
, '.cc': 'page_white_cplusplus.png'
, '.php': 'page_white_php.png'
, '.rb': 'page_white_ruby.png'
, '.cpp': 'page_white_cplusplus.png'
, '.swf': 'page_white_flash.png'
, '.pdf': 'page_white_acrobat.png'
, 'default': 'page_white.png'
};


@@ -1,100 +0,0 @@
/*!
* Connect - errorHandler
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('../utils')
, url = require('url')
, fs = require('fs');
/**
* Flexible error handler, providing (_optional_) stack traces
* and error message responses for requests accepting text, html,
* or json.
*
* Options:
*
* - `showStack`, `stack` respond with both the error message and stack trace. Defaults to `false`
* - `showMessage`, `message`, respond with the exception message only. Defaults to `false`
* - `dumpExceptions`, `dump`, dump exceptions to stderr (without terminating the process). Defaults to `false`
*
* Text:
*
* By default, and when _text/plain_ is accepted a simple stack trace
* or error message will be returned.
*
* JSON:
*
* When _application/json_ is accepted, connect will respond with
* an object in the form of `{ "error": error }`.
*
* HTML:
*
* When accepted connect will output a nice html stack trace.
*
* @param {Object} options
* @return {Function}
* @api public
*/
exports = module.exports = function errorHandler(options){
options = options || {};
// defaults
var showStack = options.showStack || options.stack
, showMessage = options.showMessage || options.message
, dumpExceptions = options.dumpExceptions || options.dump
, formatUrl = options.formatUrl;
return function errorHandler(err, req, res, next){
res.statusCode = 500;
if (dumpExceptions) console.error(err.stack);
if (showStack) {
var accept = req.headers.accept || '';
// html
if (~accept.indexOf('html')) {
fs.readFile(__dirname + '/../public/style.css', 'utf8', function(e, style){
fs.readFile(__dirname + '/../public/error.html', 'utf8', function(e, html){
var stack = (err.stack || '')
.split('\n').slice(1)
.map(function(v){ return '<li>' + v + '</li>'; }).join('');
html = html
.replace('{style}', style)
.replace('{stack}', stack)
.replace('{title}', exports.title)
.replace(/\{error\}/g, utils.escape(err.toString()));
res.setHeader('Content-Type', 'text/html');
res.end(html);
});
});
// json
} else if (~accept.indexOf('json')) {
var json = JSON.stringify({ error: err });
res.setHeader('Content-Type', 'application/json');
res.end(json);
// plain text
} else {
res.writeHead(500, { 'Content-Type': 'text/plain' });
res.end(err.stack);
}
} else {
var body = showMessage
? err.toString()
: 'Internal Server Error';
res.setHeader('Content-Type', 'text/plain');
res.end(body);
}
};
};
/**
* Template title.
*/
exports.title = 'Connect';


@@ -1,76 +0,0 @@
/*!
* Connect - favicon
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var fs = require('fs')
, utils = require('../utils');
/**
* Favicon cache.
*/
var icon;
/**
* By default serves the connect favicon, or the favicon
* located by the given `path`.
*
* Options:
*
* - `maxAge` cache-control max-age directive, defaulting to 1 day
*
* Examples:
*
* connect.createServer(
* connect.favicon()
* );
*
* connect.createServer(
* connect.favicon(__dirname + '/public/favicon.ico')
* );
*
* @param {String} path
* @param {Object} options
* @return {Function}
* @api public
*/
module.exports = function favicon(path, options){
var options = options || {}
, path = path || __dirname + '/../public/favicon.ico'
, maxAge = options.maxAge || 86400000;
return function favicon(req, res, next){
if ('/favicon.ico' == req.url) {
if (icon) {
res.writeHead(200, icon.headers);
res.end(icon.body);
} else {
fs.readFile(path, function(err, buf){
if (err) return next(err);
icon = {
headers: {
'Content-Type': 'image/x-icon'
, 'Content-Length': buf.length
, 'ETag': '"' + utils.md5(buf) + '"'
, 'Cache-Control': 'public, max-age=' + (maxAge / 1000)
},
body: buf
};
res.writeHead(200, icon.headers);
res.end(icon.body);
});
}
} else {
next();
}
};
};


@@ -1,82 +0,0 @@
/*!
* Connect - limit
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Limit request bodies to the given size in `bytes`.
*
* A string representation of the bytesize may also be passed,
* for example "5mb", "200kb", "1gb", etc.
*
* Examples:
*
* var server = connect(
* connect.limit('5.5mb')
* ).listen(3000);
*
* TODO: pause EV_READ
*
* @param {Number|String} bytes
* @return {Function}
* @api public
*/
module.exports = function limit(bytes){
if ('string' == typeof bytes) bytes = parse(bytes);
if ('number' != typeof bytes) throw new Error('limit() bytes required');
return function limit(req, res, next){
var received = 0
, len = req.headers['content-length']
? parseInt(req.headers['content-length'], 10)
: null;
// deny the request
function deny() {
req.destroy();
}
// self-awareness
if (req._limit) return next();
req._limit = true;
// limit by content-length
if (len && len > bytes) {
res.statusCode = 413;
res.end('Request Entity Too Large');
return;
}
// limit
req.on('data', function(chunk){
received += chunk.length;
if (received > bytes) deny();
});
next();
};
};
/**
* Parse byte `size` string.
*
* @param {String} size
* @return {Number}
* @api private
*/
function parse(size) {
var parts = size.match(/^(\d+(?:\.\d+)?) *(kb|mb|gb)$/)
, n = parseFloat(parts[1])
, type = parts[2];
var map = {
kb: 1024
, mb: 1024 * 1024
, gb: 1024 * 1024 * 1024
};
return map[type] * n;
}
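
As a worked example of the size parsing above (a sketch that simply mirrors the regex and unit map in parse()):

// '5.5mb' -> n = 5.5, type = 'mb'
const map = { kb: 1024, mb: 1024 * 1024, gb: 1024 * 1024 * 1024 };
console.log(5.5 * map.mb); // 5767168 -- the byte threshold behind connect.limit('5.5mb')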


@@ -1,299 +0,0 @@
/*!
* Connect - logger
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Log buffer.
*/
var buf = [];
/**
* Default log buffer duration.
*/
var defaultBufferDuration = 1000;
/**
* Log requests with the given `options` or a `format` string.
*
* Options:
*
* - `format` Format string, see below for tokens
* - `stream` Output stream, defaults to _stdout_
* - `buffer` Buffer duration, defaults to 1000ms when _true_
* - `immediate` Write log line on request instead of response (for response times)
*
* Tokens:
*
* - `:req[header]` ex: `:req[Accept]`
* - `:res[header]` ex: `:res[Content-Length]`
* - `:http-version`
* - `:response-time`
* - `:remote-addr`
* - `:date`
* - `:method`
* - `:url`
* - `:referrer`
* - `:user-agent`
* - `:status`
*
* Formats:
*
* Pre-defined formats that ship with connect:
*
* - `default` ':remote-addr - - [:date] ":method :url HTTP/:http-version" :status :res[content-length] ":referrer" ":user-agent"'
* - `short` ':remote-addr - :method :url HTTP/:http-version :status :res[content-length] - :response-time ms'
* - `tiny` ':method :url :status :res[content-length] - :response-time ms'
* - `dev` concise output colored by response status for development use
*
* Examples:
*
* connect.logger() // default
* connect.logger('short')
* connect.logger('tiny')
* connect.logger('dev')
* connect.logger(':method :url - :referrer')
* connect.logger(':req[content-type] -> :res[content-type]')
* connect.logger(function(req, res){ return 'some format string' })
*
* Defining Tokens:
*
* To define a token, simply invoke `connect.logger.token()` with the
* name and a callback function. The value returned is then available
* as ":type" in this case.
*
* connect.logger.token('type', function(req, res){ return req.headers['content-type']; })
*
* Defining Formats:
*
* All default formats are defined this way, however it's public API as well:
*
* connect.logger.format('name', 'string or function')
*
* @param {String|Function|Object} format or options
* @return {Function}
* @api public
*/
exports = module.exports = function logger(options) {
if ('object' == typeof options) {
options = options || {};
} else if (options) {
options = { format: options };
} else {
options = {};
}
// output on request instead of response
var immediate = options.immediate;
// format name
var fmt = exports[options.format] || options.format || exports.default;
// compile format
if ('function' != typeof fmt) fmt = compile(fmt);
// options
var stream = options.stream || process.stdout
, buffer = options.buffer;
// buffering support
if (buffer) {
var realStream = stream
, interval = 'number' == typeof buffer
? buffer
: defaultBufferDuration;
// flush interval
setInterval(function(){
if (buf.length) {
realStream.write(buf.join(''), 'ascii');
buf.length = 0;
}
}, interval);
// swap the stream
stream = {
write: function(str){
buf.push(str);
}
};
}
return function logger(req, res, next) {
req._startTime = new Date;
// mount safety
if (req._logging) return next();
// flag as logging
req._logging = true;
// immediate
if (immediate) {
var line = fmt(exports, req, res);
if (null == line) return;
stream.write(line + '\n', 'ascii');
} else {
// proxy end to output logging
var end = res.end;
res.end = function(chunk, encoding){
res.end = end;
res.end(chunk, encoding);
var line = fmt(exports, req, res);
if (null == line) return;
stream.write(line + '\n', 'ascii');
};
}
next();
};
};
/**
* Compile `fmt` into a function.
*
* @param {String} fmt
* @return {Function}
* @api private
*/
function compile(fmt) {
fmt = fmt.replace(/"/g, '\\"');
var js = ' return "' + fmt.replace(/:([-\w]{2,})(?:\[([^\]]+)\])?/g, function(_, name, arg){
return '"\n + (tokens["' + name + '"](req, res, "' + arg + '") || "-") + "';
}) + '";'
return new Function('tokens, req, res', js);
};
/**
* Define a token function with the given `name`,
* and callback `fn(req, res)`.
*
* @param {String} name
* @param {Function} fn
* @return {Object} exports for chaining
* @api public
*/
exports.token = function(name, fn) {
exports[name] = fn;
return this;
};
/**
* Define a `fmt` with the given `name`.
*
* @param {String} name
* @param {String|Function} fmt
* @return {Object} exports for chaining
* @api public
*/
exports.format = function(name, str){
exports[name] = str;
return this;
};
// default format
exports.format('default', ':remote-addr - - [:date] ":method :url HTTP/:http-version" :status :res[content-length] ":referrer" ":user-agent"');
// short format
exports.format('short', ':remote-addr - :method :url HTTP/:http-version :status :res[content-length] - :response-time ms');
// tiny format
exports.format('tiny', ':method :url :status :res[content-length] - :response-time ms');
// dev (colored)
exports.format('dev', function(tokens, req, res){
var status = res.statusCode
, color = 32;
if (status >= 500) color = 31
else if (status >= 400) color = 33
else if (status >= 300) color = 36;
return '\033[90m' + req.method
+ ' ' + req.originalUrl + ' '
+ '\033[' + color + 'm' + res.statusCode
+ ' \033[90m'
+ (new Date - req._startTime)
+ 'ms\033[0m';
});
// request url
exports.token('url', function(req){
return req.originalUrl;
});
// request method
exports.token('method', function(req){
return req.method;
});
// response time in milliseconds
exports.token('response-time', function(req){
return new Date - req._startTime;
});
// UTC date
exports.token('date', function(){
return new Date().toUTCString();
});
// response status code
exports.token('status', function(req, res){
return res.statusCode;
});
// normalized referrer
exports.token('referrer', function(req){
return req.headers['referer'] || req.headers['referrer'];
});
// remote address
exports.token('remote-addr', function(req){
return req.socket && (req.socket.remoteAddress || (req.socket.socket && req.socket.socket.remoteAddress));
});
// HTTP version
exports.token('http-version', function(req){
return req.httpVersionMajor + '.' + req.httpVersionMinor;
});
// UA string
exports.token('user-agent', function(req){
return req.headers['user-agent'];
});
// request header
exports.token('req', function(req, res, field){
return req.headers[field.toLowerCase()];
});
// response header
exports.token('res', function(req, res, field){
return (res._headers || {})[field.toLowerCase()];
});


@@ -1,38 +0,0 @@
/*!
* Connect - methodOverride
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Provides faux HTTP method support.
*
* Pass an optional `key` to use when checking for
* a method override, otherwise defaults to _\_method_.
* The original method is available via `req.originalMethod`.
*
* @param {String} key
* @return {Function}
* @api public
*/
module.exports = function methodOverride(key){
key = key || "_method";
return function methodOverride(req, res, next) {
req.originalMethod = req.originalMethod || req.method;
// req.body
if (req.body && key in req.body) {
req.method = req.body[key].toUpperCase();
delete req.body[key];
// check X-HTTP-Method-Override
} else if (req.headers['x-http-method-override']) {
req.method = req.headers['x-http-method-override'].toUpperCase();
}
next();
};
};


@@ -1,100 +0,0 @@
/*!
* Connect - profiler
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Profile the duration of a request.
*
* Typically this middleware should be utilized
* _above_ all others, as it proxies the `res.end()`
* method, being first allows it to encapsulate all
* other middleware.
*
* Example Output:
*
* GET /
* response time 2ms
* memory rss 52.00kb
* memory vsize 2.07mb
* heap before 3.76mb / 8.15mb
* heap after 3.80mb / 8.15mb
*
* @api public
*/
module.exports = function profiler(){
return function(req, res, next){
var end = res.end
, start = snapshot();
// state snapshot
function snapshot() {
return {
mem: process.memoryUsage()
, time: new Date
};
}
// proxy res.end()
res.end = function(data, encoding){
res.end = end;
res.end(data, encoding);
compare(req, start, snapshot())
};
next();
}
};
/**
* Compare `start` / `end` snapshots.
*
* @param {IncomingRequest} req
* @param {Object} start
* @param {Object} end
* @api private
*/
function compare(req, start, end) {
console.log();
row(req.method, req.url);
row('response time:', (end.time - start.time) + 'ms');
row('memory rss:', formatBytes(end.mem.rss - start.mem.rss));
row('memory vsize:', formatBytes(end.mem.vsize - start.mem.vsize));
row('heap before:', formatBytes(start.mem.heapUsed) + ' / ' + formatBytes(start.mem.heapTotal));
row('heap after:', formatBytes(end.mem.heapUsed) + ' / ' + formatBytes(end.mem.heapTotal));
console.log();
}
/**
* Row helper
*
* @param {String} key
* @param {String} val
* @api private
*/
function row(key, val) {
console.log(' \033[90m%s\033[0m \033[36m%s\033[0m', key, val);
}
/**
* Format byte-size.
*
* @param {Number} bytes
* @return {String}
* @api private
*/
function formatBytes(bytes) {
var kb = 1024
, mb = 1024 * kb
, gb = 1024 * mb;
if (bytes < kb) return bytes + 'b';
if (bytes < mb) return (bytes / kb).toFixed(2) + 'kb';
if (bytes < gb) return (bytes / mb).toFixed(2) + 'mb';
return (bytes / gb).toFixed(2) + 'gb';
};


@@ -1,40 +0,0 @@
/*!
* Connect - query
* Copyright(c) 2011 TJ Holowaychuk
* Copyright(c) 2011 Sencha Inc.
* MIT Licensed
*/
/**
* Module dependencies.
*/
var qs = require('qs')
, parse = require('url').parse;
/**
* Automatically parse the query-string when available,
* populating the `req.query` object.
*
* Examples:
*
* connect(
* connect.query()
* , function(req, res){
* res.end(JSON.stringify(req.query));
* }
* ).listen(3000);
*
* @return {Function}
* @api public
*/
module.exports = function query(){
return function query(req, res, next){
req.query = ~req.url.indexOf('?')
? qs.parse(parse(req.url).query)
: {};
next();
};
};


@@ -1,34 +0,0 @@
/*!
* Connect - responseTime
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Adds the `X-Response-Time` header displaying the response
* duration in milliseconds.
*
* @return {Function}
* @api public
*/
module.exports = function responseTime(){
return function(req, res, next){
var writeHead = res.writeHead
, start = new Date;
if (res._responseTime) return next();
res._responseTime = true;
// proxy writeHead to calculate duration
res.writeHead = function(status, headers){
var duration = new Date - start;
res.setHeader('X-Response-Time', duration + 'ms');
res.writeHead = writeHead;
res.writeHead(status, headers);
};
next();
};
};


@@ -1,379 +0,0 @@
/*!
* Connect - router
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('../utils')
, parse = require('url').parse;
/**
* Expose router.
*/
exports = module.exports = router;
/**
* Supported HTTP / WebDAV methods.
*/
var _methods = exports.methods = [
'get'
, 'post'
, 'put'
, 'delete'
, 'connect'
, 'options'
, 'trace'
, 'copy'
, 'lock'
, 'mkcol'
, 'move'
, 'propfind'
, 'proppatch'
, 'unlock'
, 'report'
, 'mkactivity'
, 'checkout'
, 'merge'
];
/**
* Provides Sinatra and Express-like routing capabilities.
*
* Examples:
*
* connect.router(function(app){
* app.get('/user/:id', function(req, res, next){
* // populates req.params.id
* });
* app.put('/user/:id', function(req, res, next){
* // populates req.params.id
* });
* })
*
* @param {Function} fn
* @return {Function}
* @api public
*/
function router(fn){
var self = this
, methods = {}
, routes = {}
, params = {};
if (!fn) throw new Error('router provider requires a callback function');
// Generate method functions
_methods.forEach(function(method){
methods[method] = generateMethodFunction(method.toUpperCase());
});
// Alias del -> delete
methods.del = methods.delete;
// Apply callback to all methods
methods.all = function(){
var args = arguments;
_methods.forEach(function(name){
methods[name].apply(this, args);
});
return self;
};
// Register param callback
methods.param = function(name, fn){
params[name] = fn;
};
fn.call(this, methods);
function generateMethodFunction(name) {
var localRoutes = routes[name] = routes[name] || [];
return function(path, fn){
var keys = []
, middleware = [];
// slice middleware
if (arguments.length > 2) {
middleware = Array.prototype.slice.call(arguments, 1, arguments.length);
fn = middleware.pop();
middleware = utils.flatten(middleware);
}
fn.middleware = middleware;
if (!path) throw new Error(name + ' route requires a path');
if (!fn) throw new Error(name + ' route ' + path + ' requires a callback');
var regexp = path instanceof RegExp
? path
: normalizePath(path, keys);
localRoutes.push({
fn: fn
, path: regexp
, keys: keys
, orig: path
, method: name
});
return self;
};
}
function router(req, res, next){
var route
, self = this;
(function pass(i){
if (route = match(req, routes, i)) {
var i = 0
, keys = route.keys;
req.params = route.params;
// Param preconditions
(function param(err) {
try {
var key = keys[i++]
, val = req.params[key]
, fn = params[key];
if ('route' == err) {
pass(req._route_index + 1);
// Error
} else if (err) {
next(err);
// Param has callback
} else if (fn) {
// Return style
if (1 == fn.length) {
req.params[key] = fn(val);
param();
// Middleware style
} else {
fn(req, res, param, val);
}
// Finished processing params
} else if (!key) {
// route middleware
i = 0;
(function nextMiddleware(err){
var fn = route.middleware[i++];
if ('route' == err) {
pass(req._route_index + 1);
} else if (err) {
next(err);
} else if (fn) {
fn(req, res, nextMiddleware);
} else {
route.call(self, req, res, function(err){
if (err) {
next(err);
} else {
pass(req._route_index + 1);
}
});
}
})();
// More params
} else {
param();
}
} catch (err) {
next(err);
}
})();
} else if ('OPTIONS' == req.method) {
options(req, res, routes);
} else {
next();
}
})();
};
router.remove = function(path, method){
var fns = router.lookup(path, method);
fns.forEach(function(fn){
routes[fn.method].splice(fn.index, 1);
});
};
router.lookup = function(path, method, ret){
ret = ret || [];
// method specific lookup
if (method) {
method = method.toUpperCase();
if (routes[method]) {
routes[method].forEach(function(route, i){
if (path == route.orig) {
var fn = route.fn;
fn.regexp = route.path;
fn.keys = route.keys;
fn.path = route.orig;
fn.method = route.method;
fn.index = i;
ret.push(fn);
}
});
}
// global lookup
} else {
_methods.forEach(function(method){
router.lookup(path, method, ret);
});
}
return ret;
};
router.match = function(url, method, ret){
var ret = ret || []
, i = 0
, fn
, req;
// method specific matches
if (method) {
method = method.toUpperCase();
req = { url: url, method: method };
while (fn = match(req, routes, i)) {
i = req._route_index + 1;
ret.push(fn);
}
// global matches
} else {
_methods.forEach(function(method){
router.match(url, method, ret);
});
}
return ret;
};
return router;
}
/**
* Respond to OPTIONS.
*
* @param {ServerRequest} req
* @param {ServerResponse} res
* @param {Array} routes
* @api private
*/
function options(req, res, routes) {
var pathname = parse(req.url).pathname
, body = optionsFor(pathname, routes).join(',');
res.writeHead(200, {
'Content-Length': body.length
, 'Allow': body
});
res.end(body);
}
/**
* Return OPTIONS array for the given `path`, matching `routes`.
*
* @param {String} path
* @param {Array} routes
* @return {Array}
* @api private
*/
function optionsFor(path, routes) {
return _methods.filter(function(method){
var arr = routes[method.toUpperCase()];
for (var i = 0, len = arr.length; i < len; ++i) {
if (arr[i].path.test(path)) return true;
}
}).map(function(method){
return method.toUpperCase();
});
}
/**
* Normalize the given path string,
* returning a regular expression.
*
* An empty array should be passed,
* which will contain the placeholder
* key names. For example "/user/:id" will
* then contain ["id"].
*
* @param {String} path
* @param {Array} keys
* @return {RegExp}
* @api private
*/
function normalizePath(path, keys) {
path = path
.concat('/?')
.replace(/\/\(/g, '(?:/')
.replace(/(\/)?(\.)?:(\w+)(?:(\(.*?\)))?(\?)?/g, function(_, slash, format, key, capture, optional){
keys.push(key);
slash = slash || '';
return ''
+ (optional ? '' : slash)
+ '(?:'
+ (optional ? slash : '')
+ (format || '') + (capture || '([^/]+?)') + ')'
+ (optional || '');
})
.replace(/([\/.])/g, '\\$1')
.replace(/\*/g, '(.+)');
return new RegExp('^' + path + '$', 'i');
}
/**
* Attempt to match the given request to
* one of the routes. When successful
* a route function is returned.
*
* @param {ServerRequest} req
* @param {Object} routes
* @return {Function}
* @api private
*/
function match(req, routes, i) {
var captures
, method = req.method
, i = i || 0;
if ('HEAD' == method) method = 'GET';
if (routes = routes[method]) {
var url = parse(req.url)
, pathname = url.pathname;
for (var len = routes.length; i < len; ++i) {
var route = routes[i]
, fn = route.fn
, path = route.path
, keys = fn.keys = route.keys;
if (captures = path.exec(pathname)) {
fn.method = method;
fn.params = [];
for (var j = 1, len = captures.length; j < len; ++j) {
var key = keys[j-1],
val = typeof captures[j] === 'string'
? decodeURIComponent(captures[j])
: captures[j];
if (key) {
fn.params[key] = val;
} else {
fn.params.push(val);
}
}
req._route_index = i;
return fn;
}
}
}
}
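For reference, the normalizePath() helper above is what turns an Express-style path string into a case-insensitive regular expression plus a list of captured key names. A minimal standalone sketch (a verbatim copy of the helper, exercised with an illustrative path):
// Sketch: standalone copy of normalizePath() above, showing how '/user/:id'
// becomes a regexp and how the ':id' placeholder is recorded in `keys`.
function normalizePath(path, keys) {
  path = path
    .concat('/?')
    .replace(/\/\(/g, '(?:/')
    .replace(/(\/)?(\.)?:(\w+)(?:(\(.*?\)))?(\?)?/g, function(_, slash, format, key, capture, optional){
      keys.push(key);
      slash = slash || '';
      return ''
        + (optional ? '' : slash)
        + '(?:'
        + (optional ? slash : '')
        + (format || '') + (capture || '([^/]+?)') + ')'
        + (optional || '');
    })
    .replace(/([\/.])/g, '\\$1')
    .replace(/\*/g, '(.+)');
  return new RegExp('^' + path + '$', 'i');
}

var keys = [];
var re = normalizePath('/user/:id', keys);
console.log(keys);                    // [ 'id' ]
console.log(re.test('/user/42'));     // true
console.log(re.exec('/user/42')[1]);  // '42'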


@@ -1,346 +0,0 @@
/*!
* Connect - session
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var Session = require('./session/session')
, MemoryStore = require('./session/memory')
, Cookie = require('./session/cookie')
, Store = require('./session/store')
, utils = require('./../utils')
, parse = require('url').parse
, crypto = require('crypto');
// environment
var env = process.env.NODE_ENV;
/**
* Expose the middleware.
*/
exports = module.exports = session;
/**
* Expose constructors.
*/
exports.Store = Store;
exports.Cookie = Cookie;
exports.Session = Session;
exports.MemoryStore = MemoryStore;
/**
* Warning message for `MemoryStore` usage in production.
*/
var warning = 'Warning: connection.session() MemoryStore is not\n'
+ 'designed for a production environment, as it will leak\n'
+ 'memory, and obviously only work within a single process.';
/**
* Default finger-printing function.
*/
function defaultFingerprint(req) {
var ua = req.headers['user-agent'] || '';
return ua.replace(/;?\schromeframe\/[\d\.]+/, '');
};
/**
* Paths to ignore.
*/
exports.ignore = [];
/**
* Setup session store with the given `options`.
*
* Session data is _not_ saved in the cookie itself; however,
* cookies are used, so we must use the [cookieParser()](middleware-cookieParser.html)
* middleware _before_ `session()`.
*
* Examples:
*
* connect.createServer(
* connect.cookieParser()
* , connect.session({ secret: 'keyboard cat' })
* );
*
* Options:
*
* - `key` cookie name defaulting to `connect.sid`
* - `store` Session store instance
* - `fingerprint` Custom fingerprint generating function
* - `cookie` Session cookie settings, defaulting to `{ path: '/', httpOnly: true, maxAge: 14400000 }`
* - `secret` Secret string used to compute hash
*
* Ignore Paths:
*
* By default `/favicon.ico` is the only ignored path; all others
* will utilize sessions. To manipulate the ignored paths, use
* `connect.session.ignore.push('/my/path')`. This works for _full_
* pathnames only, not segments nor substrings.
*
* connect.session.ignore.push('/robots.txt');
*
* ## req.session
*
* To store or access session data, simply use the request property `req.session`,
* which is (generally) serialized as JSON by the store, so nested objects
* are typically fine. For example below is a user-specific view counter:
*
* connect(
* connect.cookieParser()
* , connect.session({ secret: 'keyboard cat', cookie: { maxAge: 60000 }})
* , connect.favicon()
* , function(req, res, next){
* var sess = req.session;
* if (sess.views) {
* res.setHeader('Content-Type', 'text/html');
* res.write('<p>views: ' + sess.views + '</p>');
* res.write('<p>expires in: ' + (sess.cookie.maxAge / 1000) + 's</p>');
* res.end();
* sess.views++;
* } else {
* sess.views = 1;
* res.end('welcome to the session demo. refresh!');
* }
* }
* ).listen(3000);
*
* ## Session#regenerate()
*
* To regenerate the session, simply invoke the method; once complete,
* a new SID and `Session` instance will be initialized at `req.session`.
*
* req.session.regenerate(function(err){
* // will have a new session here
* });
*
* ## Session#destroy()
*
* Destroys the session and removes `req.session`; it will be re-generated on the next request.
*
* req.session.destroy(function(err){
* // cannot access session here
* });
*
* ## Session#reload()
*
* Reloads the session data.
*
* req.session.reload(function(err){
* // session updated
* });
*
* ## Session#save()
*
* Save the session.
*
* req.session.save(function(err){
* // session saved
* });
*
* ## Session#touch()
*
* Updates the `.maxAge` and `.lastAccess` properties. Typically this is
* not necessary to call, as the session middleware does this for you.
*
* ## Session#cookie
*
* Each session has a unique cookie object accompanying it. This allows
* you to alter the session cookie per visitor. For example we can
* set `req.session.cookie.expires` to `false` to enable the cookie
* to remain for only the duration of the user-agent.
*
* ## Session#maxAge
*
* Alternatively, `req.session.cookie.maxAge` returns the time remaining
* in milliseconds; we may also assign it a new value to adjust the
* `.expires` property accordingly. The following are essentially equivalent:
*
* var hour = 3600000;
* req.session.cookie.expires = new Date(Date.now() + hour);
* req.session.cookie.maxAge = hour;
*
* For example when `maxAge` is set to `60000` (one minute), and 30 seconds
* has elapsed it will return `30000` until the current request has completed,
* at which time `req.session.touch()` is called to update `req.session.lastAccess`,
* and reset `req.session.maxAge` to its original value.
*
* req.session.cookie.maxAge;
* // => 30000
*
* Session Store Implementation:
*
* Every session store _must_ implement the following methods
*
* - `.get(sid, callback)`
* - `.set(sid, session, callback)`
* - `.destroy(sid, callback)`
*
* Recommended methods include, but are not limited to:
*
* - `.length(callback)`
* - `.clear(callback)`
*
* For an example implementation view the [connect-redis](http://github.com/visionmedia/connect-redis) repo.
*
* @param {Object} options
* @return {Function}
* @api public
*/
function session(options){
var options = options || {}
, key = options.key || 'connect.sid'
, secret = options.secret
, store = options.store || new MemoryStore
, fingerprint = options.fingerprint || defaultFingerprint
, cookie = options.cookie;
// notify user that this store is not
// meant for a production environment
if ('production' == env && store instanceof MemoryStore) {
console.warn(warning);
}
// ensure secret is present
if (!secret) {
throw new Error('connect.session({ secret: "string" }) required for security');
}
// session hashing function
store.hash = function(req, base) {
return crypto
.createHmac('sha256', secret)
.update(base + fingerprint(req))
.digest('base64')
.replace(/=*$/, '');
};
// generates the new session
store.generate = function(req){
var base = utils.uid(24);
var sessionID = base + '.' + store.hash(req, base);
req.sessionID = sessionID;
req.session = new Session(req);
req.session.cookie = new Cookie(cookie);
};
return function session(req, res, next) {
// self-awareness
if (req.session) return next();
// parse url
var url = parse(req.url)
, path = url.pathname;
// ignorable paths
if (~exports.ignore.indexOf(path)) return next();
// expose store
req.sessionStore = store;
// proxy writeHead() to Set-Cookie
var writeHead = res.writeHead;
res.writeHead = function(status, headers){
if (req.session) {
var cookie = req.session.cookie;
// only send secure session cookies when there is a secure connection.
// proxySecure is a custom attribute to allow for a reverse proxy
// to handle SSL connections and to communicate to connect over HTTP that
// the incoming connection is secure.
var secured = cookie.secure && (req.connection.encrypted || req.connection.proxySecure);
if (secured || !cookie.secure) {
res.setHeader('Set-Cookie', cookie.serialize(key, req.sessionID));
}
}
res.writeHead = writeHead;
return res.writeHead(status, headers);
};
// proxy end() to commit the session
var end = res.end;
res.end = function(data, encoding){
res.end = end;
if (req.session) {
// HACK: ensure Set-Cookie for implicit writeHead()
if (!res._header) res._implicitHeader();
req.session.resetMaxAge();
req.session.save(function(){
res.end(data, encoding);
});
} else {
res.end(data, encoding);
}
};
// session hashing
function hash(base) {
return store.hash(req, base);
}
// generate the session
function generate() {
store.generate(req);
}
// get the sessionID from the cookie
req.sessionID = req.cookies[key];
// make a new session if the browser doesn't send a sessionID
if (!req.sessionID) {
generate();
next();
return;
}
// check the fingerprint
var parts = req.sessionID.split('.');
if (parts[1] != hash(parts[0])) {
generate();
next();
return;
}
// generate the session object
var pause = utils.pause(req);
store.get(req.sessionID, function(err, sess){
// proxy to resume() events
var _next = next;
next = function(err){
_next(err);
pause.resume();
}
// error handling
if (err) {
if ('ENOENT' == err.code) {
generate();
next();
} else {
next(err);
}
// no session
} else if (!sess) {
generate();
next();
// populate req.session
} else {
store.createSession(req, sess);
next();
}
});
};
};
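The session ID produced by store.generate() above is a random base joined to an HMAC of that base plus the request fingerprint, and the middleware later re-computes the HMAC to validate incoming cookies. A minimal sketch of that round trip using Node's crypto module directly (the secret, base, and fingerprint values here are illustrative only):
// Sketch: how the middleware above builds and checks a session ID.
var crypto = require('crypto');

function hash(secret, base, fingerprint) {
  return crypto
    .createHmac('sha256', secret)
    .update(base + fingerprint)
    .digest('base64')
    .replace(/=*$/, '');          // strip '=' padding, as store.hash() does
}

var secret = 'keyboard cat';      // illustrative
var fingerprint = 'Mozilla/5.0';  // defaultFingerprint() uses the User-Agent
var base = 'abc123';              // in the middleware this is utils.uid(24)

var sessionID = base + '.' + hash(secret, base, fingerprint);

// Validation mirrors the middleware: split on '.' and re-compute the HMAC.
var parts = sessionID.split('.');
console.log(parts[1] === hash(secret, parts[0], fingerprint)); // true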


@@ -1,126 +0,0 @@
/*!
* Connect - session - Cookie
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('../../utils');
/**
* Initialize a new `Cookie` with the given `options`.
*
* @param {Object} options
* @api private
*/
var Cookie = module.exports = function Cookie(options) {
this.path = '/';
this.httpOnly = true;
this.maxAge = 14400000;
if (options) utils.merge(this, options);
this.originalMaxAge = undefined == this.originalMaxAge
? this.maxAge
: this.originalMaxAge;
};
/**
* Prototype.
*/
Cookie.prototype = {
/**
* Set expires `date`.
*
* @param {Date} date
* @api public
*/
set expires(date) {
this._expires = date;
this.originalMaxAge = this.maxAge;
},
/**
* Get expires `date`.
*
* @return {Date}
* @api public
*/
get expires() {
return this._expires;
},
/**
* Set expires via max-age in `ms`.
*
* @param {Number} ms
* @api public
*/
set maxAge(ms) {
this.expires = 'number' == typeof ms
? new Date(Date.now() + ms)
: ms;
},
/**
* Get expires max-age in `ms`.
*
* @return {Number}
* @api public
*/
get maxAge() {
return this.expires instanceof Date
? this.expires.valueOf() - Date.now()
: this.expires;
},
/**
* Return cookie data object.
*
* @return {Object}
* @api private
*/
get data() {
return {
originalMaxAge: this.originalMaxAge
, expires: this._expires
, secure: this.secure
, httpOnly: this.httpOnly
, domain: this.domain
, path: this.path
}
},
/**
* Return a serialized cookie string.
*
* @return {String}
* @api public
*/
serialize: function(name, val){
return utils.serializeCookie(name, val, this.data);
},
/**
* Return JSON representation of this cookie.
*
* @return {Object}
* @api private
*/
toJSON: function(){
return this.data;
}
};
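The paired maxAge/expires accessors above make a relative max-age and an absolute expiry date two views of the same state. A small sketch of the underlying date arithmetic (plain Date math, not the Cookie object itself):
// Sketch: what `set maxAge(ms)` and `get maxAge()` above compute.
var hour = 3600000;

var expires = new Date(Date.now() + hour);      // set maxAge(hour) stores this date
var remaining = expires.valueOf() - Date.now(); // get maxAge() converts it back

console.log(remaining <= hour && remaining > hour - 50); // true, allowing a few ms of drift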


@@ -1,131 +0,0 @@
/*!
* Connect - session - MemoryStore
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var Store = require('./store')
, utils = require('../../utils')
, Session = require('./session');
/**
* Initialize a new `MemoryStore`.
*
* @api public
*/
var MemoryStore = module.exports = function MemoryStore() {
this.sessions = {};
};
/**
* Inherit from `Store.prototype`.
*/
MemoryStore.prototype.__proto__ = Store.prototype;
/**
* Attempt to fetch session by the given `sid`.
*
* @param {String} sid
* @param {Function} fn
* @api public
*/
MemoryStore.prototype.get = function(sid, fn){
var self = this;
process.nextTick(function(){
var expires
, sess = self.sessions[sid];
if (sess) {
sess = JSON.parse(sess);
expires = 'string' == typeof sess.cookie.expires
? new Date(sess.cookie.expires)
: sess.cookie.expires;
if (!expires || new Date < expires) {
fn(null, sess);
} else {
self.destroy(sid, fn);
}
} else {
fn();
}
});
};
/**
* Commit the given `sess` object associated with the given `sid`.
*
* @param {String} sid
* @param {Session} sess
* @param {Function} fn
* @api public
*/
MemoryStore.prototype.set = function(sid, sess, fn){
var self = this;
process.nextTick(function(){
self.sessions[sid] = JSON.stringify(sess);
fn && fn();
});
};
/**
* Destroy the session associated with the given `sid`.
*
* @param {String} sid
* @api public
*/
MemoryStore.prototype.destroy = function(sid, fn){
var self = this;
process.nextTick(function(){
delete self.sessions[sid];
fn && fn();
});
};
/**
* Invoke the given callback `fn` with all active sessions.
*
* @param {Function} fn
* @api public
*/
MemoryStore.prototype.all = function(fn){
var arr = []
, keys = Object.keys(this.sessions);
for (var i = 0, len = keys.length; i < len; ++i) {
arr.push(this.sessions[keys[i]]);
}
fn(null, arr);
};
/**
* Clear all sessions.
*
* @param {Function} fn
* @api public
*/
MemoryStore.prototype.clear = function(fn){
this.sessions = {};
fn && fn();
};
/**
* Fetch number of sessions.
*
* @param {Function} fn
* @api public
*/
MemoryStore.prototype.length = function(fn){
fn(null, Object.keys(this.sessions).length);
};


@@ -1,137 +0,0 @@
/*!
* Connect - session - Session
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var utils = require('../../utils')
, Cookie = require('./cookie');
/**
* Create a new `Session` with the given request and `data`.
*
* @param {IncomingRequest} req
* @param {Object} data
* @api private
*/
var Session = module.exports = function Session(req, data) {
Object.defineProperty(this, 'req', { value: req });
Object.defineProperty(this, 'id', { value: req.sessionID });
if ('object' == typeof data) {
utils.merge(this, data);
} else {
this.lastAccess = Date.now();
}
};
/**
* Update `.lastAccess` timestamp,
* and reset `.cookie.maxAge` to prevent
* the cookie from expiring when the
* session is still active.
*
* @return {Session} for chaining
* @api public
*/
Session.prototype.touch = function(){
return this
.resetLastAccess()
.resetMaxAge();
};
/**
* Update `.lastAccess` timestamp.
*
* @return {Session} for chaining
* @api public
*/
Session.prototype.resetLastAccess = function(){
this.lastAccess = Date.now();
return this;
};
/**
* Reset `.maxAge` to `.originalMaxAge`.
*
* @return {Session} for chaining
* @api public
*/
Session.prototype.resetMaxAge = function(){
this.cookie.maxAge = this.cookie.originalMaxAge;
return this;
};
/**
* Save the session data with optional callback `fn(err)`.
*
* @param {Function} fn
* @return {Session} for chaining
* @api public
*/
Session.prototype.save = function(fn){
this.req.sessionStore.set(this.id, this, fn || function(){});
return this;
};
/**
* Re-loads the session data _without_ altering
* the maxAge or lastAccess properties. Invokes the
* callback `fn(err)`, after which time if no exception
* has occurred the `req.session` property will be
* a new `Session` object, although representing the
* same session.
*
* @param {Function} fn
* @return {Session} for chaining
* @api public
*/
Session.prototype.reload = function(fn){
var req = this.req
, store = this.req.sessionStore;
store.get(this.id, function(err, sess){
if (err) return fn(err);
if (!sess) return fn(new Error('failed to load session'));
store.createSession(req, sess);
fn();
});
return this;
};
/**
* Destroy `this` session.
*
* @param {Function} fn
* @return {Session} for chaining
* @api public
*/
Session.prototype.destroy = function(fn){
delete this.req.session;
this.req.sessionStore.destroy(this.id, fn);
return this;
};
/**
* Regenerate this request's session.
*
* @param {Function} fn
* @return {Session} for chaining
* @api public
*/
Session.prototype.regenerate = function(fn){
this.req.sessionStore.regenerate(this.req, fn);
return this;
};


@@ -1,87 +0,0 @@
/*!
* Connect - session - Store
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var EventEmitter = require('events').EventEmitter
, Session = require('./session')
, Cookie = require('./cookie')
, utils = require('../../utils');
/**
* Initialize abstract `Store`.
*
* @api private
*/
var Store = module.exports = function Store(options){};
/**
* Inherit from `EventEmitter.prototype`.
*/
Store.prototype.__proto__ = EventEmitter.prototype;
/**
* Re-generate the given request's session.
*
* @param {IncomingRequest} req
* @param {Function} fn
* @api public
*/
Store.prototype.regenerate = function(req, fn){
var self = this;
this.destroy(req.sessionID, function(err){
self.generate(req);
fn(err);
});
};
/**
* Load a `Session` instance via the given `sid`
* and invoke the callback `fn(err, sess)`.
*
* @param {String} sid
* @param {Function} fn
* @api public
*/
Store.prototype.load = function(sid, fn){
var self = this;
this.get(sid, function(err, sess){
if (err) return fn(err);
if (!sess) return fn();
var req = { sessionID: sid, sessionStore: self };
sess = self.createSession(req, sess, false);
fn(null, sess);
});
};
/**
* Create session from JSON `sess` data.
*
* @param {IncomingRequest} req
* @param {Object} sess
* @return {Session}
* @api private
*/
Store.prototype.createSession = function(req, sess, update){
var expires = sess.cookie.expires
, orig = sess.cookie.originalMaxAge
, update = null == update ? true : false;
sess.cookie = new Cookie(sess.cookie);
if ('string' == typeof expires) sess.cookie.expires = new Date(expires);
sess.cookie.originalMaxAge = orig;
req.session = new Session(req, sess);
if (update) req.session.resetLastAccess();
return req.session;
};
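The abstract Store above supplies regenerate(), load() and createSession(); a concrete store only has to provide get, set and destroy (plus optional length/clear), as the session docs earlier note. A hypothetical minimal custom store, sketched against that contract:
// Sketch: a tiny in-process session store built on the Store contract above.
// 'TinyStore' is a hypothetical name; only get/set/destroy are required.
var connect = require('connect');
var Store = connect.session.Store;   // assumes Connect 1.x exports, as above

function TinyStore() {
  this.data = {};
}
TinyStore.prototype.__proto__ = Store.prototype;

TinyStore.prototype.get = function(sid, fn){
  var json = this.data[sid];
  fn(null, json ? JSON.parse(json) : undefined);
};

TinyStore.prototype.set = function(sid, sess, fn){
  this.data[sid] = JSON.stringify(sess);
  fn && fn();
};

TinyStore.prototype.destroy = function(sid, fn){
  delete this.data[sid];
  fn && fn();
};

// Usage: connect.session({ secret: 'keyboard cat', store: new TinyStore() })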


@@ -1,225 +0,0 @@
/*!
* Connect - staticProvider
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var fs = require('fs')
, path = require('path')
, join = path.join
, basename = path.basename
, normalize = path.normalize
, utils = require('../utils')
, Buffer = require('buffer').Buffer
, parse = require('url').parse
, mime = require('mime');
/**
* Static file server with the given `root` path.
*
* Examples:
*
* var oneDay = 86400000;
*
* connect(
* connect.static(__dirname + '/public')
* ).listen(3000);
*
* connect(
* connect.static(__dirname + '/public', { maxAge: oneDay })
* ).listen(3000);
*
* Options:
*
* - `maxAge` Browser cache maxAge in milliseconds. defaults to 0
* - `hidden` Allow transfer of hidden files. defaults to false
* - `redirect` Redirect to trailing "/" when the pathname is a dir
*
* @param {String} root
* @param {Object} options
* @return {Function}
* @api public
*/
exports = module.exports = function static(root, options){
options = options || {};
// root required
if (!root) throw new Error('static() root path required');
options.root = root;
return function static(req, res, next) {
options.path = req.url;
options.getOnly = true;
send(req, res, next, options);
};
};
/**
* Expose mime module.
*/
exports.mime = mime;
/**
* Respond with 416 "Requested Range Not Satisfiable"
*
* @param {ServerResponse} res
* @api private
*/
function invalidRange(res) {
var body = 'Requested Range Not Satisfiable';
res.setHeader('Content-Type', 'text/plain');
res.setHeader('Content-Length', body.length);
res.statusCode = 416;
res.end(body);
}
/**
* Attempt to transfer the requested file to `res`.
*
* @param {ServerRequest} req
* @param {ServerResponse} res
* @param {Function} next
* @param {Object} options
* @api private
*/
var send = exports.send = function(req, res, next, options){
options = options || {};
if (!options.path) throw new Error('path required');
// setup
var maxAge = options.maxAge || 0
, ranges = req.headers.range
, head = 'HEAD' == req.method
, get = 'GET' == req.method
, root = options.root ? normalize(options.root) : null
, redirect = false === options.redirect ? false : true
, getOnly = options.getOnly
, fn = options.callback
, hidden = options.hidden
, done;
// replace next() with callback when available
if (fn) next = fn;
// ignore non-GET requests
if (getOnly && !get && !head) return next();
// parse url
var url = parse(options.path)
, path = decodeURIComponent(url.pathname)
, type;
// null byte(s)
if (~path.indexOf('\0')) return utils.badRequest(res);
// when root is not given, consider .. malicious
if (!root && ~path.indexOf('..')) return utils.forbidden(res);
// join / normalize from optional root dir
path = normalize(join(root, path));
// malicious path
if (root && 0 != path.indexOf(root)) return fn
? fn(new Error('Forbidden'))
: utils.forbidden(res);
// index.html support
if (normalize('/') == path[path.length - 1]) path += 'index.html';
// "hidden" file
if (!hidden && '.' == basename(path)[0]) return next();
fs.stat(path, function(err, stat){
// mime type
type = mime.lookup(path);
// ignore ENOENT
if (err) {
if (fn) return fn(err);
return 'ENOENT' == err.code
? next()
: next(err);
// redirect directory in case index.html is present
} else if (stat.isDirectory()) {
if (!redirect) return next();
res.statusCode = 301;
res.setHeader('Location', url.pathname + '/');
res.end('Redirecting to ' + url.pathname + '/');
return;
}
// header fields
if (!res.getHeader('Date')) res.setHeader('Date', new Date().toUTCString());
if (!res.getHeader('Cache-Control')) res.setHeader('Cache-Control', 'public, max-age=' + (maxAge / 1000));
if (!res.getHeader('Last-Modified')) res.setHeader('Last-Modified', stat.mtime.toUTCString());
if (!res.getHeader('ETag')) res.setHeader('ETag', utils.etag(stat));
if (!res.getHeader('content-type')) {
var charset = mime.charsets.lookup(type);
res.setHeader('Content-Type', type + (charset ? '; charset=' + charset : ''));
}
res.setHeader('Accept-Ranges', 'bytes');
// conditional GET support
if (utils.conditionalGET(req)) {
if (!utils.modified(req, res)) {
req.emit('static');
return utils.notModified(res);
}
}
var opts = {};
var chunkSize = stat.size;
// we have a Range request
if (ranges) {
ranges = utils.parseRange(stat.size, ranges);
// valid
if (ranges) {
// TODO: stream options
// TODO: multiple support
opts.start = ranges[0].start;
opts.end = ranges[0].end;
chunkSize = opts.end - opts.start + 1;
res.statusCode = 206;
res.setHeader('Content-Range', 'bytes '
+ opts.start
+ '-'
+ opts.end
+ '/'
+ stat.size);
// invalid
} else {
return fn
? fn(new Error('Requested Range Not Satisfiable'))
: invalidRange(res);
}
}
res.setHeader('Content-Length', chunkSize);
// transfer
if (head) return res.end();
// stream
var stream = fs.createReadStream(path, opts);
req.emit('static', stream);
stream.pipe(res);
// callback
if (fn) {
function callback(err) { done || fn(err); done = true }
req.on('close', callback);
stream.on('end', callback);
}
});
};
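Besides whole-file transfers, send() above answers HEAD without a body and honours Range headers with a 206 status and a Content-Range header. A sketch of a ranged request against a connect.static() server (port, file name, and directory are illustrative):
// Sketch: requesting the first 100 bytes from a static() server. The 206
// status and Content-Range header come from the range handling above.
var connect = require('connect');
var http = require('http');

connect(
  connect.static(__dirname + '/public', { maxAge: 86400000 })
).listen(3000, function(){
  http.get({
    port: 3000,
    path: '/index.html',                        // assumes public/index.html exists
    headers: { 'Range': 'bytes=0-99' }
  }, function(res){
    console.log(res.statusCode);                // 206 when the range is satisfiable
    console.log(res.headers['content-range']);  // 'bytes 0-99/<total size>'
    res.resume();
  });
});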


@@ -1,175 +0,0 @@
/*!
* Connect - staticCache
* Copyright(c) 2011 Sencha Inc.
* MIT Licensed
*/
/**
* Module dependencies.
*/
var http = require('http')
, utils = require('../utils')
, Cache = require('../cache')
, url = require('url')
, fs = require('fs');
/**
* Enables a memory cache layer on top of
* the `static()` middleware, serving popular
* static files.
*
* By default a maximum of 128 objects are
* held in cache, with a max of 256k each,
* totalling ~32mb.
*
* A Least-Recently-Used (LRU) cache algo
* is implemented through the `Cache` object,
* simply rotating cache objects as they are
* hit. This means that increasingly popular
* objects maintain their positions while
* others get shoved out of the stack and
* garbage collected.
*
* Benchmarks:
*
* static(): 2700 rps
* node-static: 5300 rps
* static() + staticCache(): 7500 rps
*
* Options:
*
* - `maxObjects` max cache objects [128]
* - `maxLength` max cache object length 256kb
*
* @param {Object} options
* @return {Function}
* @api public
*/
module.exports = function staticCache(options){
var options = options || {}
, cache = new Cache(options.maxObjects || 128)
, maxlen = options.maxLength || 1024 * 256;
return function staticCache(req, res, next){
var path = url.parse(req.url).pathname
, ranges = req.headers.range
, hit = cache.get(path)
, hitCC
, uaCC
, header
, age;
// cache static
req.on('static', function(stream){
var headers = res._headers
, cc = utils.parseCacheControl(headers['cache-control'] || '')
, contentLength = headers['content-length']
, hit;
// ignore larger files
if (!contentLength || contentLength > maxlen) return;
// don't cache items we shouldn't
if ( cc['no-cache']
|| cc['no-store']
|| cc['private']
|| cc['must-revalidate']) return;
// if already in cache then validate
if (hit = cache.get(path)){
if (headers.etag == hit[0].etag) {
hit[0].date = new Date;
return;
} else {
cache.remove(path);
}
}
// validation notifications don't contain a stream
if (null == stream) return;
// add the cache object
var arr = cache.add(path);
arr.push(headers);
// store the chunks
stream.on('data', function(chunk){
arr.push(chunk);
});
// flag it as complete
stream.on('end', function(){
arr.complete = true;
});
});
// cache hit; range requests are not served from the cache
if (hit && hit.complete && !ranges) {
header = utils.merge({}, hit[0]);
header.Age = age = (new Date - new Date(header.date)) / 1000 | 0;
header.date = new Date().toUTCString();
// parse cache-controls
hitCC = utils.parseCacheControl(header['cache-control'] || '');
uaCC = utils.parseCacheControl(req.headers['cache-control'] || '');
// check if we must revalidate(bypass)
if (hitCC['no-cache'] || uaCC['no-cache']) return next();
// check freshness of entity
if (isStale(hitCC, age) || isStale(uaCC, age)) return next();
// conditional GET support
if (utils.conditionalGET(req)) {
if (!utils.modified(req, res, header)) {
header['content-length'] = 0;
res.writeHead(304, header);
return res.end();
}
}
// HEAD support
if ('HEAD' == req.method) {
header['content-length'] = 0;
res.writeHead(200, header);
return res.end();
}
// respond with cache
res.writeHead(200, header);
// backpressure
function write(i) {
var buf = hit[i];
if (!buf) return res.end();
if (false === res.write(buf)) {
res.once('drain', function(){
write(++i);
});
} else {
write(++i);
}
}
return write(1);
}
next();
}
};
/**
* Check if cache item is stale
*
* @param {Object} cc
* @param {Number} age
* @return {Boolean}
* @api private
*/
function isStale(cc, age) {
return cc['max-age'] && cc['max-age'] <= age;
}
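The intended deployment, per the benchmark comment above, is staticCache() stacked in front of static(); the cache listens for the 'static' event that static() emits when it streams a file. A minimal sketch of that stacking (paths and limits are illustrative):
// Sketch: memory-cached static file serving, as benchmarked above.
var connect = require('connect');

connect(
    connect.staticCache({ maxObjects: 128, maxLength: 1024 * 256 })
  , connect.static(__dirname + '/public', { maxAge: 86400000 })
).listen(3000);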


@@ -1,44 +0,0 @@
/*!
* Connect - vhost
* Copyright(c) 2010 Sencha Inc.
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Setup vhost for the given `hostname` and `server`.
*
* Examples:
*
* connect(
* connect.vhost('foo.com',
* connect.createServer(...middleware...)
* ),
* connect.vhost('bar.com',
* connect.createServer(...middleware...)
* )
* );
*
* @param {String} hostname
* @param {Server} server
* @return {Function}
* @api public
*/
module.exports = function vhost(hostname, server){
if (!hostname) throw new Error('vhost hostname required');
if (!server) throw new Error('vhost server required');
var regexp = new RegExp('^' + hostname.replace(/[*]/g, '(.*?)') + '$');
if (server.onvhost) server.onvhost(hostname);
return function vhost(req, res, next){
if (!req.headers.host) return next();
var host = req.headers.host.split(':')[0];
if (req.subdomains = regexp.exec(host)) {
req.subdomains = req.subdomains[0].split('.').slice(0, -1);
server.emit("request", req, res);
} else {
next();
}
};
};
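vhost() above compiles the hostname into a regexp ('*' becomes a wildcard group), re-emits the request on the matching server, and stores the host labels, minus the last one, on req.subdomains. A sketch (example.com is illustrative):
// Sketch: wildcard virtual hosting. For a request to api.example.com this
// handler sees req.subdomains === ['api', 'example'], since the code above
// drops only the final host label.
var connect = require('connect');

var apiServer = connect.createServer(function(req, res){
  res.end('host labels: ' + req.subdomains.join('.'));
});

connect(
  connect.vhost('*.example.com', apiServer)
).listen(3000);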

node_modules/connect/lib/patch.js

@@ -1,79 +0,0 @@
/*!
* Connect
* Copyright(c) 2011 TJ Holowaychuk
* MIT Licensed
*/
/**
* Module dependencies.
*/
var http = require('http')
, res = http.OutgoingMessage.prototype;
// original setHeader()
var setHeader = res.setHeader;
// original _renderHeaders()
var _renderHeaders = res._renderHeaders;
if (res._hasConnectPatch) return;
/**
* Provide a public "header sent" flag
* until node does.
*
* @return {Boolean}
* @api public
*/
res.__defineGetter__('headerSent', function(){
return this._headerSent;
});
/**
* Set header `field` to `val`, special-casing
* the `Set-Cookie` field for multiple support.
*
* @param {String} field
* @param {String} val
* @api public
*/
res.setHeader = function(field, val){
var key = field.toLowerCase()
, prev;
// special-case Set-Cookie
if (this._headers && 'set-cookie' == key) {
if (prev = this.getHeader(field)) {
val = Array.isArray(prev)
? prev.concat(val)
: [prev, val];
}
// charset
} else if ('content-type' == key && this.charset) {
val += '; charset=' + this.charset;
}
return setHeader.call(this, field, val);
};
/**
* Proxy `res.end()` to expose a 'header' event,
* allowing arbitrary augmentation before the header
* fields are written to the socket.
*
* NOTE: this _only_ supports node's progressive header
* field API aka `res.setHeader()`.
*/
res._renderHeaders = function(){
this.emit('header');
return _renderHeaders.call(this);
};
res._hasConnectPatch = true;
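Because the patch above proxies _renderHeaders() to emit a 'header' event, middleware can adjust headers at the last possible moment before they are flushed. A sketch, assuming the patch is loaded as a side effect of require('connect'); the header name and the `_start` property are illustrative:
// Sketch: using the 'header' event added by the patch above to stamp a
// response header just before the head is written.
var connect = require('connect');   // loading connect applies this patch

connect(
  function(req, res, next){
    req._start = Date.now();
    res.on('header', function(){
      res.setHeader('X-Response-Time', (Date.now() - req._start) + 'ms');
    });
    next();
  },
  function(req, res){
    res.end('hello');
  }
).listen(3000);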


@@ -1,75 +0,0 @@
<html>
<head>
<title>listing directory {directory}</title>
<style>{style}</style>
<script>
function $(id){
var el = 'string' == typeof id
? document.getElementById(id)
: id;
el.on = function(event, fn){
if ('content loaded' == event) event = 'DOMContentLoaded';
el.addEventListener(event, fn, false);
};
el.all = function(selector){
return $(el.querySelectorAll(selector));
};
el.each = function(fn){
for (var i = 0, len = el.length; i < len; ++i) {
fn($(el[i]), i);
}
};
el.getClasses = function(){
return this.getAttribute('class').split(/\s+/);
};
el.addClass = function(name){
var classes = this.getAttribute('class');
el.setAttribute('class', classes
? classes + ' ' + name
: name);
};
el.removeClass = function(name){
var classes = this.getClasses().filter(function(curr){
return curr != name;
});
this.setAttribute('class', classes);
};
return el;
}
function search() {
var str = $('search').value
, links = $('files').all('a');
links.each(function(link){
var text = link.textContent;
if ('..' == text) return;
if (str.length && ~text.indexOf(str)) {
link.addClass('highlight');
} else {
link.removeClass('highlight');
}
});
}
$(window).on('content loaded', function(){
$('search').on('keyup', search);
});
</script>
</head>
<body class="directory">
<input id="search" type="text" placeholder="Search" autocomplete="off" />
<div id="wrapper">
<h1>{linked-path}</h1>
{files}
</div>
</body>
</html>


@@ -1,13 +0,0 @@
<html>
<head>
<title>{error}</title>
<style>{style}</style>
</head>
<body>
<div id="wrapper">
<h1>{title}</h1>
<h2><em>500</em> {error}</h2>
<ul id="stacktrace">{stack}</ul>
</div>
</body>
</html>

41 binary files (deleted image assets, 294 B to 1.4 KiB each) are not shown.

Some files were not shown because too many files have changed in this diff.