Summary: See D19117. Instead of automatically figuring this out inside `phutil_tag()`, explicitly add rel="noreferrer" at the application level to all external links.
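As a minimal sketch of what the application-level change amounts to (the exact attribute set at each callsite varies), external links now pass `rel` explicitly to `phutil_tag()`:
```
// Sketch: set rel="noreferrer" explicitly where the external link is built,
// instead of having phutil_tag() infer it from target="_blank".
$link = phutil_tag(
  'a',
  array(
    'href' => $uri,           // external, possibly user-controlled URI
    'target' => '_blank',
    'rel' => 'noreferrer',
  ),
  $name);
```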
Test Plan:
- Grepped for `_blank`, `isValidRemoteURIForLink`, checked all callsites for user-controlled data.
- Created a link menu item, verified noreferrer in markup.
- Created a link custom field, verified noreferrer in markup.
- Verified noreferrer for `{nav href=...}`.
Subscribers: PHID-OPKG-gm6ozazyms6q6i22gyam
Differential Revision: https://secure.phabricator.com/D19118
Summary:
Depends on D19009. Ref T13053. For "Must Encrypt" mail, we must currently strip the "Thread-Topic" header because it sometimes contains sensitive information about the object.
I don't actually know if this header is useful or if anything uses it. My understanding is that it's an Outlook/Exchange thing, but we also implement "Thread-Index", which I think is what Outlook/Exchange actually look at. This header may have done something before we implemented "Thread-Index", or maybe it never did anything. Or maybe older versions of Exchange/Outlook did something with it and newer versions don't, or do less. So it's possible that an even better fix here would be to simply remove this, but I wasn't able to convince myself of that after Googling for 10 minutes and I don't think it's worth hours of installing Exchange/Outlook to figure out. Instead, I'm just trying to simplify our handling of this header for now, and maybe some day we'll learn more about Exchange/Outlook and can remove it.
In a number of cases we already use the object monogram or PHID as a "Thread-Topic" without users ever complaining, so I think that if this header is useful it probably isn't shown to users, or isn't shown very often (e.g., only in a specific "conversation" sub-view?). Just use the object PHID (which should be unique and stable) as a thread-topic, everywhere, automatically.
Then allow this header through for "Must Encrypt" mail.
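For reference, the resulting header is just the object PHID (this example value is made up):
```
Thread-Topic: PHID-TASK-aaaabbbbccccddddeeee
```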
Test Plan: Processed some local mail, saw object PHIDs for "Thread-Topic" headers.
Reviewers: amckinley
Maniphest Tasks: T13053
Differential Revision: https://secure.phabricator.com/D19012
Summary:
Depends on D18952. Ref T13049. For files larger than 8MB, we need to engage the chunk storage engine. `PhabricatorFile::newFromFileData()` always writes a single chunk, and can't handle files larger than the mandatory chunk threshold (8MB).
Use `IteratorUploadSource`, which can, and "stream" the data into it. This should raise the limit from 8MB to 2GB (maximum size of a string in PHP).
If we need to go above 2GB we could stream CSV and text pretty easily, and JSON without too much trouble, but Excel might be trickier. Hopefully no one is trying to export 2GB+ datafiles, though.
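Very roughly, the "streaming" shape looks like this; the helper below is an illustrative sketch, not Phabricator's actual API:
```
// Hypothetical sketch: feed the export data to a chunk-aware upload source as
// an iterator of fixed-size pieces, so no single write exceeds the chunk limit.
function export_chunk_iterator($data, $chunk_size = 4 * 1024 * 1024) {
  for ($offset = 0; $offset < strlen($data); $offset += $chunk_size) {
    yield substr($data, $offset, $chunk_size);
  }
}
```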
Test Plan:
- Changed the JSON exporter to just export roughly 9MB of the letter "q" (just over the 8MB chunk threshold): `return str_repeat('q', 1024 * 1024 * 9);`.
- Before change: fatal, "no storage engine can store this file".
- After change: export works cleanly.
Reviewers: amckinley
Reviewed By: amckinley
Maniphest Tasks: T13049
Differential Revision: https://secure.phabricator.com/D18953
Summary:
See <https://discourse.phabricator-community.org/t/files-created-from-repository-contents-slightly-over-one-chunk-in-size-are-truncated-to-exactly-one-chunk-in-size/988/1>. Three issues here:
- When we finish reading `git cat-file ...` or whatever, we can end up with more than one chunk worth of bytes left in the internal buffer if the read is fast. Use `while` instead of `if` to make sure we write the whole buffer.
- Limiting output with `setStdoutSizeLimit()` isn't really a reliable way to limit the size if we're also reading from the buffer. It's also pretty indirect and confusing. Instead, just let the `FileUploadSource` explicitly implement a byte limit in a straightforward way.
- We weren't setting the time limit correctly on the main path.
Overall, this could cause >4MB files to "write" as 4MB files, with the rest of the file left in the UploadSource buffer. Since these files were technically under the limit, they could return as valid. This was intermittent.
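A sketch of the first fix (the buffer and write method here are illustrative, not the exact code):
```
// Before, `if` wrote at most one chunk per read cycle, so a fast subprocess
// could exit with more than one chunk still sitting in the buffer. Drain it:
while (strlen($buffer) >= $chunk_size) {
  $this->writeChunk(substr($buffer, 0, $chunk_size)); // writeChunk() is hypothetical
  $buffer = substr($buffer, $chunk_size);
}
```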
Test Plan:
- Pushed a ~4.2MB file.
- Reloaded Diffusion a bunch, sometimes saw the `while/if` buffer race and produce a 4MB file with a prompt to download it. (Other times, the buffer worked right and the page just says "this file is too big, sorry").
- Applied patches.
- Reloaded Diffusion a bunch, no longer saw bad behavior or truncated files.
Reviewers: amckinley
Reviewed By: amckinley
Differential Revision: https://secure.phabricator.com/D18885
Summary:
Fixes T13031. "Enormous" changes are basically changes which are too large to hold in memory, although the actual definition we use today is "more than 1GB of change text, or a `git diff` that runs for more than 15 minutes".
If an install configures a Herald content rule like "when content matches /XYZ/, do something" and then a user pushes a 30 GB source file, we can't put it into memory to `preg_match()` it. Currently, the way to handle this case is to write a separate Herald rule that rejects enormous changes. However, this isn't obvious and means the default behavior is unsafe.
Make the default behavior safe by rejecting these changes with a message, similar to how we reject "dangerous" changes (which permanently delete or overwrite history) by default.
Also, change a couple of UI strings from "Enormous" to "Very Large" to reduce ambiguity. See <https://discourse.phabricator-community.org/t/herald-enormous-check/822>.
Test Plan: Changed the definition of "enormous" from 1GB to 1 byte. Pushed a change; got rejected. Allowed enormous changes, pushed, got rejected by a Herald rule. Disabled the Herald rule, pushed, got a clean push. Prevented enormous changes again. Grepped for "enormous" elsewhere in the UI.
Reviewers: amckinley
Reviewed By: amckinley
Subscribers: joshuaspence
Maniphest Tasks: T13031
Differential Revision: https://secure.phabricator.com/D18850
Summary:
Depends on D18828. Ref T7789. See <https://discourse.phabricator-community.org/t/git-lfs-fails-with-large-images/584>.
Currently, when you upload a large (>4MB) image, we may try to assess the dimensions for the image and for each individual chunk.
At best, this is slow and not useful. At worst, it fatals or consumes a ton of memory and I/O we don't need to be using.
Instead:
- Don't try to assess dimensions for chunked files.
- Don't try to assess dimensions for the chunks themselves.
- Squelch errors for bad data, etc., that `gd` can't actually read, since we recover sensibly.
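A sketch of the intended guard; the flags are illustrative, and only the `gd` calls are real:
```
// Hypothetical sketch: skip dimension detection for chunked files and for the
// chunks themselves, and squelch gd warnings on data it can't parse.
if ($is_chunk || $is_chunked_file) {
  return null;
}
$image = @imagecreatefromstring($data); // suppress warnings for bad data
if ($image === false) {
  return null; // recover sensibly: just skip dimension metadata
}
list($x, $y) = array(imagesx($image), imagesy($image));
```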
Test Plan:
- Created a 2048x2048 PNG in Photoshop using the "Random Noise" filter; it weighs in at 8.5MB.
- Uploaded it.
- Before patch: got complaints in log about `imagecreatefromstring()` failing, although the actual upload went OK in my environment.
- After patch: clean log, no attempt to detect the size of a big image.
- Also uploaded a small image, got dimensions detected properly still.
Reviewers: amckinley
Reviewed By: amckinley
Maniphest Tasks: T7789
Differential Revision: https://secure.phabricator.com/D18830
Summary:
Ref T10405. If `gd` is not installed, a number of unit tests currently fail because they generate synthetic users and try to composite profile pictures for them.
Instead, just fall back to a static default if `gd` is not available.
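A minimal sketch of the fallback, assuming a builtin resource is available under this name (the actual code path differs):
```
// Hypothetical sketch: if gd is unavailable, skip compositing entirely and
// use a static default picture instead.
if (!function_exists('imagecreatefromstring')) {
  return PhabricatorFile::loadBuiltin($viewer, 'profile.png'); // builtin name is an assumption
}
```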
Test Plan: Faked this pathway, created a new user, saw a default profile picture.
Reviewers: amckinley, lpriestley, avivey
Reviewed By: avivey
Maniphest Tasks: T10405
Differential Revision: https://secure.phabricator.com/D18812
Summary:
Depends on D18792. Fixes T13024. Fixes T89198. Currently, when users are partway through logging in (for example, they still need to enter MFA) we show more menu items than we should.
Notably, we may show some personalized/private account details, like the number of unread notifications (probably not relevant) or a user's saved queries (possibly sensitive). At best these are misleading (they won't work yet) and there's an outside possibility they leak a little bit of private data.
Instead, nuke everything except "Log Out" when users have partial sessions.
Test Plan:
Hit a partial session (MFA required, email verification required) and looked at the menu. Only saw "Log Out".
{F5297713}
Reviewers: amckinley
Reviewed By: amckinley
Maniphest Tasks: T13024
Differential Revision: https://secure.phabricator.com/D18793
Summary: See PHI176. Depends on D18733. We issue a query when deleting files that currently doesn't hit any keys.
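The fix boils down to giving that query a usable key; judging from the `key_engine` key in the EXPLAIN output below, something along these lines (the actual migration may differ):
```
ALTER TABLE `file`
  ADD KEY `key_engine` (`storageEngine`, `storageHandle`);
```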
Test Plan:
Ran `./bin/remove destroy --force --trace F56376` to get the query.
Ran ##SELECT * FROM `file` WHERE storageEngine = 'blob' AND storageHandle = '23366' LIMIT 1## before and after the change.
Before:
```
mysql> explain SELECT * FROM `file` WHERE storageEngine = 'blob' AND storageHandle = '23366' LIMIT 1;
+----+-------------+-------+------+---------------+------+---------+------+-------+-------------+
| id | select_type | table | type | possible_keys | key  | key_len | ref  | rows  | Extra       |
+----+-------------+-------+------+---------------+------+---------+------+-------+-------------+
|  1 | SIMPLE      | file  | ALL  | NULL          | NULL | NULL    | NULL | 33866 | Using where |
+----+-------------+-------+------+---------------+------+---------+------+-------+-------------+
1 row in set (0.01 sec)
```
After:
```
mysql> explain SELECT * FROM `file` WHERE storageEngine = 'blob' AND storageHandle = '23366' LIMIT 1;
+----+-------------+-------+------+---------------+------------+---------+-------------+------+------------------------------------+
| id | select_type | table | type | possible_keys | key        | key_len | ref         | rows | Extra                              |
+----+-------------+-------+------+---------------+------------+---------+-------------+------+------------------------------------+
|  1 | SIMPLE      | file  | ref  | key_engine    | key_engine | 388     | const,const |  190 | Using index condition; Using where |
+----+-------------+-------+------+---------------+------------+---------+-------------+------+------------------------------------+
1 row in set (0.00 sec)
```
Reviewers: amckinley
Reviewed By: amckinley
Differential Revision: https://secure.phabricator.com/D18734
Summary:
Ref T13008. Depends on D18701. The overall goal here is to make turning `enable_post_data_reading` off not break things, so we can run rate limiting checks before we read file uploads.
The biggest blocker for this is that turning it off stops `$_FILES` from coming into existence.
This //appears// to mostly work. Specifically:
- Skip the `max_post_size` check when POST is off, since it's meaningless.
- Don't read or scrub $_POST at startup when POST is off.
- When we rebuild REQUEST and POST before processing requests, do multipart parsing if we need to and rebuild FILES.
- Skip the `is_uploaded_file()` check if we built FILES ourselves.
This probably breaks a couple of small things, like maybe `__profile__` and other DarkConsole triggers over POST, and probably some other weird stuff. The parsers may also need more work than they've received so far.
I also need to verify that this actually works (i.e., lets us run code without reading the request body) but I'll include that in the change where I update the actual rate limiting.
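A sketch of the shape of the startup change, assuming we detect the setting with `ini_get()`; the multipart parsing itself is elided:
```
// Hypothetical sketch: when native POST parsing is off, PHP never populates
// $_POST or $_FILES, so rebuild them from the raw request body ourselves.
$post_enabled = (bool)ini_get('enable_post_data_reading');
if (!$post_enabled) {
  $raw_body = file_get_contents('php://input');
  // ...run our own multipart/form-data parser over $raw_body, repopulate
  // $_POST / $_FILES / $_REQUEST, and skip is_uploaded_file() checks for
  // files we built ourselves...
}
```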
Test Plan:
- Disabled `enable_post_data_reading`.
- Uploaded a file with a vanilla upload form (project profile image).
- Uploaded a file with drag and drop.
- Used DarkConsole.
- Submitted comments.
- Created a task.
- Browsed around.
Reviewers: amckinley
Reviewed By: amckinley
Maniphest Tasks: T13008
Differential Revision: https://secure.phabricator.com/D18702
Summary: Noticed a couple of typos in the docs, and then things got out of hand.
Test Plan:
- Stared at the words until my eyes watered and the letters began to swim on the screen.
- Consulted a dictionary.
Reviewers: #blessed_reviewers, epriestley
Reviewed By: #blessed_reviewers, epriestley
Subscribers: epriestley, yelirekim, PHID-OPKG-gm6ozazyms6q6i22gyam
Differential Revision: https://secure.phabricator.com/D18693
Summary:
Fixes T12948. See that task for substantial discussion and context. Briefly:
- This workflow is very old, and won't work for large (>2GB) files.
- This workflow has become more dangerous than it once was, and can fail in several ways that delete data and/or make recovery much more difficult (see T12948 for more discussion).
- This was originally added in D6068, which is a bit muddled, but looks like "one install ran into a weird issue so I wrote a script for them"; this would be a Consulting/Support issue and not come upstream today. I can't identify any arguments for retaining this workflow there, at least.
Test Plan:
- Grepped for `files purge`, got nothing.
- Grepped for `purge`, looked for anything that looked like instructions or documentation, got nothing.
I don't recall recommending anyone run this script in many years, and didn't even remember that it existed or what it did when T12948 was reported, so I believe it is not in widespread use.
Reviewers: joshuaspence, chad
Reviewed By: joshuaspence
Maniphest Tasks: T12948
Differential Revision: https://secure.phabricator.com/D18384
Summary: Cursory research indicates that "login" is a noun, referring to a form, and "log in" is a verb, referring to the action of logging in. I went through every instance of 'login' I could find and tried to clarify all this language. Also, we have "Phabricator" on the registration form like 4-5 times, which is a bit verbose, so I tried to simplify that language as well.
Test Plan: Tested logging in and logging out. Pages feel simpler.
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Differential Revision: https://secure.phabricator.com/D18322
Summary: Builds out an icon -> image map in Projects and selects the icon's image as the default project image if the user has not chosen a custom image.
Test Plan: Select various icons, see image change. Test choose picture, pick a new image.
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Differential Revision: https://secure.phabricator.com/D18174
Summary:
Ref T12857. This is generally fairly fuzzy for now, but here's something concrete: when we build a large file with `diffusion.filecontentquery`, we compute the MIME type of all chunks, not just the initial chunk.
Instead, pass a dummy MIME type to non-initial chunks so we don't try to compute them. This mirrors logic elsewhere, in `file.uploadchunk`. This should perhaps be centralized at some point, but it's a bit tricky since the file doesn't know that it's a chunk until later.
Also, clean up the `TempFile` immediately -- this shouldn't actually affect anything, but we don't need it to live any longer than this.
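Roughly the idea; the dummy value, the flag, and the parameter key here are illustrative:
```
// Sketch: only sniff the initial chunk; give non-initial chunks a dummy MIME
// type so no detection work (or temporary file reads) happens for them.
if ($is_initial_chunk) {
  $params['mime-type'] = Filesystem::getMimeType($chunk_path);
} else {
  $params['mime-type'] = 'application/octet-stream';
}
```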
Test Plan:
- Made `hashFileContent()` return `null` to skip the chunk cache.
- Added `phlog()` to the MIME type computation.
- Loaded a 12MB file in Diffusion.
- Before patch: Saw 3x MIME type computations, one for each 4MB chunk.
- After patch: Saw 1x MIME type computation, for initial chunk only.
Reviewers: chad, amckinley
Reviewed By: chad
Maniphest Tasks: T12857
Differential Revision: https://secure.phabricator.com/D18138
Summary: Fixes T12828. This API was tightened up in D17616, but I missed this callsite.
Test Plan: I've just got a good feeling in my bones.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12828
Differential Revision: https://secure.phabricator.com/D18119
Summary: Ran across a few stragglers. Convert to the correct color.
Test Plan: grep for profile-image-button, check profile image selection page
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Differential Revision: https://secure.phabricator.com/D18096
Summary: Try to disambiguate various button types and colors. Moves `simple` to `phui-button-simple` and moves colors to `button-color`.
Test Plan: Grep for buttons still inline, UIExamples, PHUIX, Herald, and Email Preferences.
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Differential Revision: https://secure.phabricator.com/D18077
Summary: I'm not sure you can actually remove a project's image (maybe via the API?), but I kept the code for rendering the relevant title/feed anyway.
Test Plan: Unit tests + adding/changing project pictures.
Reviewers: #blessed_reviewers, epriestley
Reviewed By: #blessed_reviewers, epriestley
Subscribers: Korvin, epriestley
Differential Revision: https://secure.phabricator.com/D17954
Summary: Fixes T12614.
Test Plan:
- Ran tests, saw them pass.
- Changed a thumbnail to 800x800, saw tests detect that the default image was missing.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12614
Differential Revision: https://secure.phabricator.com/D17773
Summary:
Ref T7567. In T8266 I fixed a bunch of obscure "Range" issues, but only for file downloads -- not for Celerity.
Extend all that stuff to Celerity, which is fortunately much easier.
I believe this will fix Conpherence sounds in Safari.
Test Plan:
- Wrote out an HTTP request in a text file with `Range: bytes=0-1` and similar, piped it to localhost with `cat request.txt | nc localhost 80`, saw server return appropriate range responses consistent with file behavior after T8266, which all seems to work.
- Also did that for files to try to make sure I wasn't breaking anything.
Reviewers: chad, amckinley
Reviewed By: chad
Maniphest Tasks: T7567
Differential Revision: https://secure.phabricator.com/D17724
Summary: Fixes T12587. Adds a new `PhabricatorFileDeleteTransaction` that enqueues `File` delete tasks.
Test Plan:
- hack `PhabricatorFileQuery` to ignore isDeleted state
- stop daemons
- upload a file, delete it from the UI
- check that the DB has updated isDeleted = 1
- check timeline rendering in `File` detail view
- start daemons
- confirm rows are deleted from DB
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin, thoughtpolice
Maniphest Tasks: T12587
Differential Revision: https://secure.phabricator.com/D17723
Summary:
Deletion is a possibly time-intensive process, especially with large files that are backed by high-latency, chunked storage (such as S3). Even ~200MB objects take minutes to delete, which makes for an unhappy experience. Fixes T10828.
Test Plan:
Delete a large file, and stare in awe of the swiftness with which I am redirected to the main file application.
Reviewers: #blessed_reviewers, epriestley
Reviewed By: #blessed_reviewers, epriestley
Subscribers: thoughtpolice, Korvin
Maniphest Tasks: T10828
Differential Revision: https://secure.phabricator.com/D15743
Summary: Ensures that newly-made `File` objects get indexed into the new ngrams index. Fixes T8788.
Test Plan:
- uploaded a file with daemons stopped; confirmed no new rows in ngrams table
- started daemons; confirmed indexing of previously-uploaded files happened
- uploaded a new file with daemons running; confirmed it got added to the index
Not sure how to test the changes to `PhabricatorFileUploadSource->writeChunkedFile()` and `PhabricatorChunkedFileStorageEngine->allocateChunks()`. I spent a few minutes trying to find their callers, but the first looks like it requires a Diffusion repo and the 2nd is only accessible via Conduit. I can test that stuff if necessary, but it's such a small change that I'm not worried about it.
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Maniphest Tasks: T8788
Differential Revision: https://secure.phabricator.com/D17718
Summary: Follows the outline in D15656 for implementing ngram search for names of File objects. Also created FileFullTextEngine, because without implementing `PhabricatorFulltextInterface`, `./bin/search` complains that `File` is not an indexable type.
Test Plan:
- ran `./bin/storage upgrade` to apply the schema change
- confirmed the presence of a new `file_filename_ngrams` table
- added a couple file objects
- ran `bin/search index --type file --force`
- confirmed the presence of rows in `file_filename_ngrams`
- did a few keyword searches and saw expected results
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Maniphest Tasks: T8788
Differential Revision: https://secure.phabricator.com/D17702
Summary: Fixes T12536. Nothing reads this parameter; `PhabricatorFile::newChunkedFile` sets the `isPartial` flag automatically.
Test Plan: Grepped for `isPartial`.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12536
Differential Revision: https://secure.phabricator.com/D17654
Summary:
Fixes T12531. Strictness fallout from adding typechecking in D17616.
- `chunkedHash` is not a real parameter, so the new typechecking was unhappy about it.
- `mime-type` no longer allows `null`.
Test Plan:
- Ran `arc upload --conduit-uri ... 12MB.zero` on a 12MB file full of zeroes.
- Before patch: badness, failure, fallback to one-shot uploads.
- After patch: success and glory.
Reviewers: chad
Subscribers: joshuaspence
Maniphest Tasks: T12531
Differential Revision: https://secure.phabricator.com/D17651
Summary: Ref T12509. This encourages code to move away from HMAC+SHA1 by making the method name more obviously undesirable.
Test Plan: `grep`, browsed around.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12509
Differential Revision: https://secure.phabricator.com/D17632
Summary:
Ref T12509. This adds support for HMAC+SHA256 (instead of HMAC+SHA1). Although HMAC+SHA1 is not currently broken in any sense, SHA1 has a well-known collision and it's good to look at moving away from HMAC+SHA1.
The new mechanism also automatically generates and stores HMAC keys.
Currently, HMAC keys largely use a per-install constant defined in `security.hmac-key`. In theory this can be changed, but in practice essentially no install changes it.
We generally (in fact, always, I think?) don't use HMAC digests in a way where it matters that this key is well-known, but it's slightly better if this key is unique per class of use cases. Principally, if use cases have unique HMAC keys they are generally less vulnerable to precomputation attacks where an attacker might generate a large number of HMAC hashes of well-known values and use them in a nefarious way. The actual threat here is probably close to nonexistent, but we can harden against it without much extra effort.
Beyond that, this isn't something users should really have to think about or bother configuring.
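The digest itself is standard PHP; here's a minimal sketch (key generation and storage here are illustrative, not the new mechanism's actual code):
```
// Sketch: HMAC+SHA256 with a per-use-case key that is generated once and then
// stored, instead of HMAC+SHA1 keyed with the shared security.hmac-key value.
$key = Filesystem::readRandomBytes(32);      // generated automatically, then persisted
$mac = hash_hmac('sha256', $message, $key);
```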
Test Plan:
- Added unit tests.
- Used `bin/files integrity` to verify, strip, and recompute hashes.
- Tampered with a generated HMAC key, verified it invalidated hashes.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12509
Differential Revision: https://secure.phabricator.com/D17630
Summary:
Ref T12470. Provides an "integrity" utility which runs in these modes:
- Verify: check that hashes match.
- Compute: backfill missing hashes.
- Strip: remove hashes. Useful for upgrading across a hash change.
- Corrupt: intentionally corrupt hashes. Useful for debugging.
- Overwrite: force hash recomputation.
Users normally shouldn't need to run any of this stuff, but this provides a reasonable toolkit for managing integrity hashes.
I'll recommend existing installs use `bin/files integrity --compute all` in the upgrade guidance to backfill hashes for existing files.
Test Plan:
- Ran the script in many modes against various files, saw expected operation, including:
- Verified a file, corrupted it, saw it fail.
- Verified a file, stripped it, saw it have no hash.
- Stripped a file, computed it, got a clean verify.
- Stripped a file, overwrote it, got a clean verify.
- Corrupted a file, overwrote it, got a clean verify.
- Overwrote a file, overwrote again, got a no-op.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12470
Differential Revision: https://secure.phabricator.com/D17629
Summary:
Ref T12219. We currently only support Range requests like "bytes=123-456", but "bytes=123-", meaning "until end of file", is valid, and Chrome can send these requests.
I suspect this is the issue with T12219.
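A sketch of the parsing change (variable names are illustrative):
```
// Sketch: accept both "bytes=123-456" and the open-ended "bytes=123-".
$matches = null;
if (preg_match('/^bytes=(\d+)-(\d*)$/', $range, $matches)) {
  $begin = (int)$matches[1];
  $end = strlen($matches[2]) ? (int)$matches[2] : ($total_bytes - 1);
}
```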
Test Plan: Used `nc local.phacility.com 80` to pipe raw requests, saw both "bytes=123-456" and "bytes=123-" requests satisfied correctly.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12219
Differential Revision: https://secure.phabricator.com/D17626
Summary:
Ref T12470. This helps defuse attacks where an adversary can directly take control of whatever storage engine files are being stored in and change data there. These attacks would require a significant level of access.
Such attackers could potentially attack ranges of AES-256-CBC encrypted files by using Phabricator as a decryption oracle if they were also able to compromise a Phabricator account with read access to the files.
By storing a hash of the data (and, in the case of AES-256-CBC files, the IV) when we write files, and verifying it before we decrypt or read them, we can detect and prevent this kind of tampering.
This also helps detect mundane corruption and integrity issues.
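Conceptually, it's a verify-before-read step; the hash construction here is a simplified assumption, not the exact storage format:
```
// Sketch: store hash(IV + ciphertext) at write time; refuse to read or
// decrypt if the recomputed hash no longer matches the stored one.
$actual = hash('sha256', $iv.$ciphertext);
if (!hash_equals($stored_hash, $actual)) {
  throw new Exception('File integrity check failed; refusing to read data.');
}
```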
Test Plan:
- Added unit tests.
- Uploaded new files, saw them get integrity hashes.
- Manually corrupted file data, saw it fail. Used `bin/files cat --salvage` to read it anyway.
- Tampered with IVs, saw integrity failures.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12470
Differential Revision: https://secure.phabricator.com/D17625
Summary:
Fixes T12079. Currently, when a file is encrypted and a request has "Content-Range", we apply the range first, //then// decrypt the result. This doesn't work since you can't start decrypting something from somewhere in the middle (at least, not with our cipher selection).
Instead: decrypt the result, //then// apply the range.
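The fix is just a reordering; schematically (the decrypt helper is hypothetical):
```
// Wrong: slicing the ciphertext first breaks CBC decryption, which can't
// start from the middle of the stream:
//   $data = decrypt_file_data(substr($ciphertext, $begin, $length));
// Right: decrypt everything, then apply the requested range.
$plaintext = decrypt_file_data($ciphertext);
$data = substr($plaintext, $begin, $end - $begin + 1);
```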
Test Plan: Added failing unit tests, made them pass
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12079
Differential Revision: https://secure.phabricator.com/D17623
Summary:
Ref T12464. This defuses any possible SHA1-collision attacks by using SHA256, for which there is no known collision.
(SHA256 hashes are larger -- 256 bits -- so expand the storage column to 64 bytes to hold them.)
Test Plan:
- Uploaded the same file twice, saw the two files generate the same SHA256 content hash and use the same underlying data.
- Tried with a fake hash algorithm ("quackxyz") to make sure the failure mode worked/degraded correctly if we don't have SHA256 for some reason. Got two valid files with two copies of the same data, as expected.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12464
Differential Revision: https://secure.phabricator.com/D17620
Summary:
Ref T12464. We currently use SHA1 to detect when two files have the same content so we don't have to store two copies of the data.
Now that a SHA1 collision is known, this is theoretically dangerous. T12464 describes the shape of a possible attack.
Before replacing this with something more robust, shore things up so things work correctly if we don't hash at all. This mechanism is entirely optional; it only helps us store less data if some files are duplicates.
(This mechanism is also less important now than it once was, before we added temporary files.)
Test Plan: Uploaded multiple identical files, saw the uploads work and the files store separate copies of the same data.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12464
Differential Revision: https://secure.phabricator.com/D17619
Summary:
Ref T12464. This is a very old method which let you create a file on the server by referring to data which already existed in another file.
Basically, long ago, `arc` could say "Do you already have a file with hash X?" and just skip some work if the server did.
`arc` has not called this method since D13017, in May 2015.
Since it's easy to do so, just make this method pretend that it never has the file. Very old clients will continue to work, since they would expect this response in the common case and continue by uploading data.
Test Plan:
- Grepped for `uploadhash` in Phabricator and Arcanist.
- Called the method with the console, verified it returned `null`.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12464
Differential Revision: https://secure.phabricator.com/D17618
Summary:
Ref T12464. This is a very old method which can return an existing file instead of creating a new one, if there's some existing file with the same content.
In the best case this is a bad idea. It was somewhat reasonable once, but that predates policies, temporary files, etc. Modern methods like `newFromFileData()` do this right: they share underlying data in storage, but not the actual `File` records.
Specifically, this is the case where we get into trouble:
- I upload a private file with content "X".
- You somehow generate a file with the same content by, say, viewing a raw diff in Differential.
- If the diff had the same content, you get my file, but you don't have permission to see it or whatever so everything breaks and is terrible.
Just get rid of this.
Test Plan:
- Generated an SSH key.
- Viewed a raw diff in Differential.
- (Did not test Phragment.)
Reviewers: chad
Reviewed By: chad
Subscribers: hach-que
Maniphest Tasks: T12464
Differential Revision: https://secure.phabricator.com/D17617
Summary:
Ref T11357. When creating a file, callers can currently specify a `ttl`. However, it's ambiguous what you're supposed to pass, and some callers get it wrong.
For example, to mean "this file expires in 60 minutes", you might pass either of these:
- `time() + phutil_units('60 minutes in seconds')`
- `phutil_units('60 minutes in seconds')`
The former means "60 minutes from now". The latter means "1 AM, January 1, 1970". In practice, because the GC normally runs only once every four hours (at least, until recently), and all the bad TTLs are cases where files are normally accessed immediately, these 1970 TTLs didn't cause any real problems.
Split `ttl` into `ttl.relative` and `ttl.absolute`, and make sure the values are sane. Then correct all callers, and simplify out the `time()` calls where possible to make switching to `PhabricatorTime` easier.
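Illustratively, the two parameters look like this at a call site (the surrounding API is elided, and only one of the two would be passed):
```
// "Expires 60 minutes from now", stated relative to the current time:
$params = array(
  'ttl.relative' => phutil_units('60 minutes in seconds'),
);
// ...or stated as an absolute epoch timestamp:
$params = array(
  'ttl.absolute' => PhabricatorTime::getNow() + phutil_units('60 minutes in seconds'),
);
```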
Test Plan:
- Generated an SSH keypair.
- Viewed a changeset.
- Viewed a raw diff.
- Viewed a commit's file data.
- Viewed a temporary file's details, saw expiration date and relative time.
- Ran unit tests.
- (Didn't really test Phragment.)
Reviewers: chad
Reviewed By: chad
Subscribers: hach-que
Maniphest Tasks: T11357
Differential Revision: https://secure.phabricator.com/D17616
Summary:
Ref T11357. In D17611, I added `file.search`, which includes a `"dataURI"`. Partly, this is building toward resolving T8348.
However, in some cases you can't GET this URI because of a security measure:
- You have not configured `security.alternate-file-domain`.
- The file isn't web-viewable.
- (The request isn't an LFS request.)
The goal of this security mechanism is just to protect against session hijacking, so it's also safe to disable it if the viewer didn't present any credentials (since that means there's nothing to hijack). Add that exception, and reorganize the code a little bit.
Test Plan:
- From the browser (with a session), tried to GET a binary data file. Got redirected.
- Got a download with POST.
- From the CLI (without a session), tried to GET a binary data file. Got a download.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T11357
Differential Revision: https://secure.phabricator.com/D17613
Summary: Ref T11357. Implements a modern `file.search` for files, and freezes `file.info`.
Test Plan: Ran `file.search` from the Conduit console.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T11357
Differential Revision: https://secure.phabricator.com/D17612
Summary:
Ref T11357. This moves editing and commenting (but not creation) to EditEngine.
Since only the name is really editable, this is pretty straightforward.
Test Plan: Renamed files; commented on files.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T11357
Differential Revision: https://secure.phabricator.com/D17611
Summary: Ref T11357. A lot of file creation doesn't go through transactions, so we only actually have one real transaction type: editing a file name.
Test Plan:
Created and edited files.
{F4559287}
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T11357
Differential Revision: https://secure.phabricator.com/D17610
Summary:
Ref T12444. A few issues:
- `x % (y - z)` doesn't generate values in the full range: the largest value is never generated. Instead, use `x % (1 + y - z)`.
- `digestToRange(1, count)` never generates 0. After fixing the first bug, it could generate `count`. The range of the arrays is `0..(count-1)`, inclusive. Generate the correct range instead.
- `unpack('L', ...)` can unpack a negative number on a 32-bit system. Use `& 0x7FFFFFFF` to mask off the sign bit so the result is always a positive integer.
- FileFinder might return arbitrary keys, but we rely on sequential keys (0, 1, 2, ...). A sketch of the combined fix follows.
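Concretely (function and variable names are illustrative):
```
// Sketch of the corrected mapping from a digest to an array index.
$bytes = substr(hash('sha256', $seed, true), 0, 4);
$digest = unpack('L', $bytes);
$value = $digest[1] & 0x7FFFFFFF;      // keep it positive on 32-bit systems
$choices = array_values($choices);     // force sequential keys 0..(count - 1)
$index = $value % count($choices);     // full range, largest index included
$choice = $choices[$index];
```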
Test Plan:
- Used `bin/people profileimage ... --force` to regenerate images.
- Added some debugging to verify that the math seemed to be working.
Reviewers: chad
Reviewed By: chad
Maniphest Tasks: T12444
Differential Revision: https://secure.phabricator.com/D17543
Summary: Ref T10319. This adds a basic means of generating default profile images for users. You can generate them for everyone, a group of users, or force updates. This only generates images and stores them as Files; it does not assign them to users.
Test Plan:
`bin/people profileimage --all` to generate all images.
`bin/people profileimage --users chad` to generate a user.
`bin/people profileimage --all --force` to force rebuilding all images.
{F3662810}
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Maniphest Tasks: T10319
Differential Revision: https://secure.phabricator.com/D17464
Summary: Ref T10319. This swaps the default in the Picture Chooser to allow picking of the custom unique avatar. We're currently going with 100k unique possibilities. The logic roughly hashes a username and picks an image pack, color, and border. Based on that, we select the first character of the username, or fall back to Psyduck if it isn't in [a-z] or [0-9].
Test Plan:
Set the following usernames from ProfilePicture as a test: chad, epriestley, sally, 007, _cat_, -doggie-.
{F3453979}
Reviewers: epriestley
Reviewed By: epriestley
Subscribers: Korvin
Maniphest Tasks: T10319
Differential Revision: https://secure.phabricator.com/D17430