Compare commits

...

132 Commits

Author SHA1 Message Date
rinpatch f5305d2815 Merge branch 'backport/1.0-branch-name-change' into 'master'
Backport branch name changes to 1.0

See merge request pleroma/pleroma!1842
2019-10-14 16:55:32 +00:00
rinpatch 7aa17c3e75 Merge branch 'master-to-stable' into 'develop'
Preparations for renaming `master` to `stable`

See merge request pleroma/pleroma!1840
2019-10-14 19:41:04 +03:00
kaniini a067cf0f23 Merge branch 'backport/1.0-index-fix' into 'master'
Backport like index fix to 1.0

See merge request pleroma/pleroma!1740
2019-09-29 11:41:59 +00:00
rinpatch b60ec3b173 Add an index on object likes
In !1538 favorites timeline was switched to use the joined object, but
no index on likes in the joined object was added.
2019-09-29 14:30:51 +03:00
rinpatch 7030721b31 Merge branch 'backport/1.0-hackney-and-outbox-fixes' into 'master'
Pleroma 1.0.7

See merge request pleroma/pleroma!1723
2019-09-26 22:05:33 +00:00
rinpatch df8fda267e Change the 1.0.7 release date to be based on UTC 2019-09-27 00:59:07 +03:00
rinpatch c8694d6d7b Bump version to 1.0.7 2019-09-27 00:44:10 +03:00
rinpatch 31421916c0 Apply suggestion to lib/pleroma/web/activity_pub/views/user_view.ex 2019-09-27 00:31:13 +03:00
rinpatch 13e7753382 Credo considered harmful 2019-09-27 00:31:07 +03:00
rinpatch 821fb2a584 Remove useless with clause 2019-09-27 00:31:00 +03:00
rinpatch 700d0a56a1 Apply suggestion to lib/pleroma/web/activity_pub/activity_pub_controller.ex 2019-09-27 00:30:52 +03:00
rinpatch 68e7bea354 Don't embed the first page in inboxes/outboxes and refactor the views to
follow View/Controller pattern

Note that I mentioned the change in 1.1 section because I intend to
backport this, if this is not needed I will move it back to Unreleased.
2019-09-27 00:30:10 +03:00
rinpatch 231e5e7914 Add a changelog entry for hackney fix 2019-09-27 00:21:13 +03:00
Haelwenn (lanodan) Monnier e17113d821 mix.lock: Bump hackney to 1.15.2
Closes: https://git.pleroma.social/pleroma/pleroma/issues/1267
2019-09-27 00:15:37 +03:00
kaniini b9188361dc Merge branch 'release/1.0.6' into 'master'
mix: bump version to 1.0.6

See merge request pleroma/pleroma!1571
2019-08-14 19:30:31 +00:00
Ariadne Conill 65cc68ff2f mix: bump version to 1.0.6 2019-08-14 19:29:41 +00:00
rinpatch 5999b4c5b9 Merge branch 'release/1.0.6' into 'master'
1.0.6 release

See merge request pleroma/pleroma!1570
2019-08-14 19:26:44 +00:00
kaniini 9aec8fe242 Apply suggestion to lib/pleroma/web/activity_pub/publisher.ex 2019-08-14 19:15:55 +00:00
Ariadne Conill 974488a52e activitypub: publisher: add (request-target) to http signature when POSTing 2019-08-14 19:15:50 +00:00
Ariadne Conill acc3c0ed58 MRF: fix up unserializable option lists in describe implementations 2019-08-14 19:09:45 +00:00
rinpatch 094db46923 Switch to pre-1.8 version of tzdata.
tzdata 1.0.0 requires Elixir 1.8.0, but we target 1.7. Fortunately
tzdata issues bugfix releases for pre-1.8.0 version.
2019-08-14 19:09:04 +00:00
kaniini d6da4a75ce Merge branch 'release/1.0.5' into 'master'
1.0.5 release

See merge request pleroma/pleroma!1549
2019-08-14 02:17:18 +00:00
Ariadne Conill dcb4711e92 mix: update version to 1.0.5 2019-08-14 02:08:56 +00:00
Ivan Tashkinov 9fa2011c17 Apply suggestion to CHANGELOG.md 2019-08-14 02:08:56 +00:00
Ivan Tashkinov bc6d1f9ed9 [#161] Refactoring, documentation. 2019-08-14 02:08:52 +00:00
Ivan Tashkinov 1a46a13d15 [#161] Limited replies depth on incoming federation in order to prevent memory leaks on recursive replies fetching. 2019-08-14 02:08:52 +00:00
Ivan Tashkinov e652f83e5a Apply suggestion to docs/config.md 2019-08-14 02:08:52 +00:00
Ariadne Conill 606ccbab03 update changelog to cover MRF describe API. 2019-08-14 01:50:26 +00:00
Ariadne Conill 4681747ec1 docs tweak 2019-08-14 01:50:26 +00:00
Ariadne Conill c7a3a15c3d mrf_vocabulary: add describe API support 2019-08-14 01:50:26 +00:00
Ariadne Conill b3afc0bb32 update changelog for mrf_vocabulary 2019-08-14 01:50:26 +00:00
Ariadne Conill d5df62edd8 tests: add tests for mrf_vocabulary 2019-08-14 01:50:26 +00:00
Ariadne Conill 7a5e1d64a5 docs: document mrf_vocabulary module settings 2019-08-14 01:50:26 +00:00
Ariadne Conill eed36278a6 MRF: add vocabulary policy module 2019-08-14 01:50:26 +00:00
Ariadne Conill fb77dc50aa fix credo 2019-08-14 01:50:26 +00:00
Ariadne Conill 3af51afdd7 tests: fix up nodeinfo tests 2019-08-14 01:50:26 +00:00
Ariadne Conill c9468c9a59 tests: add tests for MRF.describe() 2019-08-14 01:50:26 +00:00
Ariadne Conill 4ccd3410a8 nodeinfo: use MRF.describe() instead of hardcoded MRF transparency stuff 2019-08-14 01:50:26 +00:00
Ariadne Conill 7e2bc39f3c MRF: add describe() to all modules, add base MRF configuration to base describe() 2019-08-14 01:50:26 +00:00
Ariadne Conill b8186c26fa test: add mock MRF module for describe() testing 2019-08-14 01:50:26 +00:00
Ariadne Conill e13edc2d3b MRF: add describe() for gathering and describing the MRF configuration 2019-08-14 01:50:26 +00:00
Ariadne Conill 29db8dc799 config: remove legacy activitypub accept_blocks setting
Anyone who is interested in dropping blocks can write their own MRF
policy at this point.  This setting predated the MRF framework.

Disabling the side effect (unsubscription) is still a config option
per policy.
2019-08-14 01:50:26 +00:00
Ariadne Conill 54405919d8 changelog updates 2019-08-14 01:50:26 +00:00
rinpatch 5af3c00072 Do not fetch the reply object in `fix_type` unless the object has the
`name` key and use a depth limit when fetching it
2019-08-14 01:50:22 +00:00
Maksim 60c75d6740 #1110 fixed /api/pleroma/healthcheck 2019-08-14 01:50:22 +00:00
Ariadne Conill 14efbcf1f9 add removed section to changelog 2019-08-14 01:50:22 +00:00
Ariadne Conill 4d0dd04653 MRF: ensure that subdomain_match calls are case-insensitive 2019-08-14 01:50:22 +00:00
Sergey Suprunenko 25c818ed6f Redirect not logged-in users to the MastoFE login page on private instances 2019-08-14 01:50:22 +00:00
Ariadne Conill f7028ae8ac tests: mastodon account views: add missing CommonAPI alias 2019-08-14 01:50:22 +00:00
Sergey Suprunenko 48bd4ee933 Strip internal fields including likes from incoming and outgoing activities 2019-08-14 01:50:22 +00:00
rinpatch 7e412fd88a Do not re-embed the object after updating it 2019-08-14 01:50:22 +00:00
rinpatch f0c32364b7 OStatus tests: stop relying on embedded objects 2019-08-14 01:50:22 +00:00
rinpatch e5161961bd ActivityPub tests: remove assertions of embedded object being updated,
because the objects are no longer supposed to be embedded
2019-08-14 01:50:22 +00:00
rinpatch 1807d3a115 OStatus Announce Representer: Do not depend on the object being embedded
in the Create activity
2019-08-14 01:50:21 +00:00
rinpatch b85840b536 Stop depending on the embedded object in restrict_favorited_by 2019-08-14 01:50:21 +00:00
Ariadne Conill a7b947534c chase changelog updates 2019-08-14 01:50:21 +00:00
Alexander Strizhakov b9def3758a Feature/1087 wildcard option for blocks 2019-08-14 01:50:21 +00:00
Sachin Joshi e08b97853f add listener port and ip option for 'pleroma.instance gen' and enable its test 2019-08-14 01:50:21 +00:00
Ariadne Conill 8a8fe57670 tasks: relay: add list task 2019-08-14 01:50:21 +00:00
Sergey Suprunenko 1310cdc24f Handle MRF rejections of incoming AP activities 2019-08-14 01:50:21 +00:00
Haelwenn (lanodan) Monnier 54d4ceec5c tasks/pleroma/user.ex: Fix documentation of --max-use and --expire-at
Closes: https://git.pleroma.social/pleroma/pleroma/issues/1155

[ci skip]
2019-08-14 01:50:21 +00:00
Haelwenn (lanodan) Monnier f270e33208 tasks/pleroma/instance.ex: Change :upload_dir to :uploads_dir
Closes: https://git.pleroma.social/pleroma/pleroma/issues/1058
2019-08-14 01:50:21 +00:00
Ariadne Conill d2c3c64da5 update changelog for lanodan's change 2019-08-14 01:50:21 +00:00
Haelwenn (lanodan) Monnier 65bd927cf2 templates/layout/app.html.eex: Style anchors
[ci skip]
2019-08-14 01:50:21 +00:00
Ariadne Conill 951d8c28d1 update changelog for thibg's change 2019-08-14 01:50:21 +00:00
Thibaut Girka a3654d9479 Return profile URL in MastodonAPI's `url` field 2019-08-14 01:50:21 +00:00
Thibaut Girka b29dcc8c1b Simplify logic to mention.js `url` field
`User.profile_url` already falls back to ap_id
2019-08-14 01:50:21 +00:00
Thibaut Girka 6525fbac95 Return profile URL when available instead of actor URI for MastodonAPI mention URL
Fixes #1165
2019-08-14 01:50:21 +00:00
rinpatch ba21d515d6 Mastodon API: Fix thread mute detection
It was calling CommonAPI.thread_muted? with post author's account
instead of viewer's one.
2019-08-14 01:50:21 +00:00
rinpatch dcd30e3ceb Mastodon API: Set follower/following counters to 0 when hiding
followers/following is enabled

We are already doing that in AP representation, so I think we should do
it here as well for consistency.
2019-08-14 01:50:21 +00:00
kaniini 9c499435f0 Merge branch 'release/1.0.4' into 'master'
1.0.4 release

See merge request pleroma/pleroma!1519
2019-08-01 16:41:09 +00:00
Ariadne Conill 455f2934b7 mix: update version to 1.0.4 2019-08-01 16:39:59 +00:00
rinpatch 8998620d71 Fix Invalid SemVer version generation
when the current branch does not have commits ahead of tag/checked out on a tag
2019-08-01 16:39:50 +00:00
kaniini 0eff6349a0 Merge branch 'release/1.0.3' into 'master'
1.0.3 release

See merge request pleroma/pleroma!1514
2019-07-31 20:27:09 +00:00
Ariadne Conill 2536628cac mix: update version to 1.0.3 2019-07-31 20:16:12 +00:00
Ariadne Conill d006742fd7 changelog again 2019-07-31 20:14:53 +00:00
Ariadne Conill e5cb15ce2b twitter api: utils: rework do_remote_follow() to use CommonAPI
Closes #1138
2019-07-31 20:12:25 +00:00
Ariadne Conill d18a7dc0de changelog update 2019-07-31 20:11:02 +00:00
rinpatch 2c2c075fd6 Disallow following locked accounts over OStatus 2019-07-31 20:08:59 +00:00
Ariadne Conill 59e60c6db1 ostatus: explicitly disallow protocol downgrade from activitypub
This closes embargoed bug #1135.
2019-07-31 18:57:52 +00:00
kaniini cca9d64cb8 Merge branch '1048-semver-format-compliance-backport' into 'master'
Fixed version parsing in pleroma_ctl (backport)

See merge request pleroma/pleroma!1508
2019-07-31 15:13:38 +00:00
Ivan Tashkinov daf0d0dd9a [#1036] Changelog adjustment. 2019-07-31 09:33:39 +03:00
Ivan Tashkinov 1e77e71fb2 [#1036] Fixed version parsing in pleroma_ctl. 2019-07-30 15:33:51 +03:00
Ivan Tashkinov 7fdbef3e1b [#1048] Fixed version parsing in pleroma_ctl. Closes #1036. 2019-07-30 15:24:39 +03:00
kaniini ce6dfb6f06 Merge branch 'release/1.0.2' into 'master'
1.0.2 release

See merge request pleroma/pleroma!1499
2019-07-29 16:28:53 +00:00
Ariadne Conill d9aacbec4d mix: update version to 1.0.2 2019-07-28 23:15:19 +00:00
lain 3e3b00f658 Dependencies: Update tzdata and related packages. 2019-07-28 23:13:48 +00:00
Ariadne Conill f685e887b3 transmogrifier: use User.delete() instead of handrolled user deletion code for remote users
Closes #1104
2019-07-28 23:09:55 +00:00
Sergey Suprunenko 2b38961bf6 Handle 303 redirects 2019-07-28 22:50:47 +00:00
rinpatch 99989758ff Add a changelog entry for tolerating incorrect chain order 2019-07-28 22:48:16 +00:00
rinpatch 1426358538 Workaround for remote server certificate chain issues 2019-07-28 22:47:46 +00:00
rinpatch 2914f8a749 Merge the default options with custom ones in ReverseProxy and
Pleroma.HTTP
2019-07-28 22:47:41 +00:00
Bradley D. Thornton 8b1b3e84f4 Update otp_en.md - Added creation of initial admin user 2019-07-28 22:47:21 +00:00
tom79 1c364f7407 Fix domain for the contact clients.md 2019-07-28 22:46:04 +00:00
tom79 109e7813c9 Update clients.md for Fedilab
- Change owner (@fedilab@framapiaf.org), Source code: Framagit, Add some other supported features
2019-07-28 22:45:57 +00:00
rinpatch 15aaffc6d8 Add a changelog entry for changing the defaults 2019-07-28 22:44:51 +00:00
rinpatch 67c5e6541e Formatting 2019-07-28 22:44:14 +00:00
rinpatch 7486a917d4 Improve wording in CHANGELOG.md 2019-07-28 22:43:58 +00:00
rinpatch 0c028f345a Enable OpenGraph and TwitterCard by default
Closes #1034
2019-07-28 22:43:19 +00:00
rinpatch 48aed88dbd FallbackRedirector: Do not crash on Metadata rendering errors 2019-07-28 22:43:11 +00:00
rinpatch fd4963006a OGP/TwitterCard: Add fallbacks in case the attachment key is nonexistent 2019-07-28 22:42:38 +00:00
rinpatch 876b5b45a3 OTP Release install docs: Remove --dry-run in cron certbot command 2019-07-28 22:42:06 +00:00
Ariadne Conill 3ec6f34ef0 update Changelog 2019-07-28 22:41:31 +00:00
rinpatch 6a35c151c6 Fix not being able to pin unlisted posts
Closes #1038
2019-07-28 22:39:10 +00:00
Sachin Joshi 1e5d889aec preserve the original path/filename (no encoding/decoding) for proxy 2019-07-28 22:37:18 +00:00
Sachin Joshi ccafecf9be try to always match the filename for proxy url 2019-07-28 22:36:42 +00:00
Roman Chvanikov 9e88ab2cad Split alters rather than work with indexes 2019-07-28 22:35:15 +00:00
Roman Chvanikov 494a611afe Fix migration 2019-07-28 22:35:03 +00:00
lain ed639376ac Mastodon Controller: Fix tests. 2019-07-28 22:33:17 +00:00
lain 8123578bf8 Status View: Poll ids are strings.
All ids in mastodon are strings, in general.
2019-07-28 22:33:09 +00:00
kaniini 5cb37412a2 Merge branch 'release/1.0.1' into 'master'
version bump to 1.0.1

See merge request pleroma/pleroma!1421
2019-07-14 20:54:06 +00:00
Ariadne Conill e01bab843b version bump to 1.0.1 2019-07-14 20:53:10 +00:00
kaniini ba26208cec Merge branch 'release/1.0.1' into 'master'
1.0.1 release

See merge request pleroma/pleroma!1420
2019-07-14 20:31:57 +00:00
Ariadne Conill f1147a3d7f fix backport 2019-07-14 20:02:39 +00:00
Haelwenn (lanodan) Monnier c51b2abead HttpRequestMock: Add 404s on OStatus fetching for info.pleroma.site 2019-07-14 20:00:48 +00:00
Haelwenn (lanodan) Monnier 3e298cc85a HttpRequestMock: Add missing mocks for object containment tests 2019-07-14 20:00:17 +00:00
Ariadne Conill 7523ab1495 object: fix backport 2019-07-14 19:39:37 +00:00
Ariadne Conill 6b12cb60e9 docs: note that exclusions usage will be included in the transparency metrics if used 2019-07-14 19:32:07 +00:00
Ariadne Conill cdf2ff8176 nodeinfo: implement MRF transparency exclusions 2019-07-14 19:31:55 +00:00
Haelwenn (lanodan) Monnier 48927b1d3b Object.Fetcher: Keep the with-do block as per kaniini proposition 2019-07-14 19:30:39 +00:00
Haelwenn (lanodan) Monnier 1c79ec2c08 FetcherTest: Containment refute called(OStatus.fetch_activity_from_url) 2019-07-14 19:30:34 +00:00
Haelwenn (lanodan) Monnier e7a472a11f Object.Fetcher: Fallback to OStatus only if AP actually fails 2019-07-14 19:30:13 +00:00
Ariadne Conill 5e9befc7d4 tests: fix object containment violations in the transmogrifier tests
Some objects were not completely rewritten in the tests, which caused object
containment violations.  Fix them by rewriting the object IDs to be in an
appropriate namespace.
2019-07-14 19:29:15 +00:00
Ariadne Conill 6d715b7702 security: detect object containment violations at the IR level
It is more efficient to check for object containment violations at the IR
level instead of in the protocol handlers.  OStatus containment is especially
a tricky situation, as the containment rules don't match those of IR and
ActivityPub.

Accordingly, we just always do a final containment check at the IR level
before the object is added to the IR object graph.
2019-07-14 19:28:47 +00:00
Ariadne Conill 73a3dbe31e remove CC-BY-NC-ND license.
we moved branding assets (mascot etc) to CC-BY-SA a while back.
2019-07-14 19:28:32 +00:00
rinpatch 9504544d5b Merge branch 'revert/tags-releases' into 'master'
Revert "Build releases only on tags or develop"

See merge request pleroma/pleroma!1356
2019-06-29 10:35:32 +00:00
rinpatch 5bc9e6e0f3 Revert "Build releases only on tags or develop"
This reverts commit 54d2873770. It breaks
getting the latest artifact for the master branch.
2019-06-29 13:24:26 +03:00
rinpatch e9426d15d8 Merge branch 'rebase/docs-master' into 'master'
Cherry-pick documentation fixes to master

See merge request pleroma/pleroma!1354
2019-06-29 09:49:38 +00:00
rinpatch 54d2873770 Build releases only on tags or develop
Needed so we could push documentation updates to master without
triggering a rebuild
2019-06-29 12:42:19 +03:00
rinpatch 9b014eae68 Add a warning about service files assuming installation paths 2019-06-29 12:13:52 +03:00
Yuji Nakao 69585a3218 Move all upload contents despite upload folder itself. 2019-06-29 12:13:45 +03:00
Yuji Nakao d9f8230d2c Update migrating_from_source_otp_en.md: Replace `reload-daemon` with `daemon-reload`. 2019-06-29 12:13:32 +03:00
96 changed files with 1915 additions and 888 deletions

View File

@ -86,7 +86,7 @@ docs-deploy:
stage: deploy
image: alpine:3.9
only:
- master@pleroma/pleroma
- stable@pleroma/pleroma
- develop@pleroma/pleroma
before_script:
- apk update && apk add openssh-client rsync
@ -148,8 +148,10 @@ amd64:
# TODO: Replace with upstream image when 1.9.0 comes out
image: rinpatch/elixir:1.9.0-rc.0
only: &release-only
- master@pleroma/pleroma
- stable@pleroma/pleroma
- develop@pleroma/pleroma
- /^maint/.*$/@pleroma/pleroma
- /^release/.*$/@pleroma/pleroma
artifacts: &release-artifacts
name: "pleroma-$CI_COMMIT_REF_NAME-$CI_COMMIT_SHORT_SHA-$CI_JOB_NAME"
paths:

View File

@ -1,403 +0,0 @@
Attribution-NonCommercial-NoDerivatives 4.0 International
=======================================================================
Creative Commons Corporation ("Creative Commons") is not a law firm and
does not provide legal services or legal advice. Distribution of
Creative Commons public licenses does not create a lawyer-client or
other relationship. Creative Commons makes its licenses and related
information available on an "as-is" basis. Creative Commons gives no
warranties regarding its licenses, any material licensed under their
terms and conditions, or any related information. Creative Commons
disclaims all liability for damages resulting from their use to the
fullest extent possible.
Using Creative Commons Public Licenses
Creative Commons public licenses provide a standard set of terms and
conditions that creators and other rights holders may use to share
original works of authorship and other material subject to copyright
and certain other rights specified in the public license below. The
following considerations are for informational purposes only, are not
exhaustive, and do not form part of our licenses.
Considerations for licensors: Our public licenses are
intended for use by those authorized to give the public
permission to use material in ways otherwise restricted by
copyright and certain other rights. Our licenses are
irrevocable. Licensors should read and understand the terms
and conditions of the license they choose before applying it.
Licensors should also secure all rights necessary before
applying our licenses so that the public can reuse the
material as expected. Licensors should clearly mark any
material not subject to the license. This includes other CC-
licensed material, or material used under an exception or
limitation to copyright. More considerations for licensors:
wiki.creativecommons.org/Considerations_for_licensors
Considerations for the public: By using one of our public
licenses, a licensor grants the public permission to use the
licensed material under specified terms and conditions. If
the licensor's permission is not necessary for any reason--for
example, because of any applicable exception or limitation to
copyright--then that use is not regulated by the license. Our
licenses grant only permissions under copyright and certain
other rights that a licensor has authority to grant. Use of
the licensed material may still be restricted for other
reasons, including because others have copyright or other
rights in the material. A licensor may make special requests,
such as asking that all changes be marked or described.
Although not required by our licenses, you are encouraged to
respect those requests where reasonable. More considerations
for the public:
wiki.creativecommons.org/Considerations_for_licensees
=======================================================================
Creative Commons Attribution-NonCommercial-NoDerivatives 4.0
International Public License
By exercising the Licensed Rights (defined below), You accept and agree
to be bound by the terms and conditions of this Creative Commons
Attribution-NonCommercial-NoDerivatives 4.0 International Public
License ("Public License"). To the extent this Public License may be
interpreted as a contract, You are granted the Licensed Rights in
consideration of Your acceptance of these terms and conditions, and the
Licensor grants You such rights in consideration of benefits the
Licensor receives from making the Licensed Material available under
these terms and conditions.
Section 1 -- Definitions.
a. Adapted Material means material subject to Copyright and Similar
Rights that is derived from or based upon the Licensed Material
and in which the Licensed Material is translated, altered,
arranged, transformed, or otherwise modified in a manner requiring
permission under the Copyright and Similar Rights held by the
Licensor. For purposes of this Public License, where the Licensed
Material is a musical work, performance, or sound recording,
Adapted Material is always produced where the Licensed Material is
synched in timed relation with a moving image.
b. Copyright and Similar Rights means copyright and/or similar rights
closely related to copyright including, without limitation,
performance, broadcast, sound recording, and Sui Generis Database
Rights, without regard to how the rights are labeled or
categorized. For purposes of this Public License, the rights
specified in Section 2(b)(1)-(2) are not Copyright and Similar
Rights.
c. Effective Technological Measures means those measures that, in the
absence of proper authority, may not be circumvented under laws
fulfilling obligations under Article 11 of the WIPO Copyright
Treaty adopted on December 20, 1996, and/or similar international
agreements.
d. Exceptions and Limitations means fair use, fair dealing, and/or
any other exception or limitation to Copyright and Similar Rights
that applies to Your use of the Licensed Material.
e. Licensed Material means the artistic or literary work, database,
or other material to which the Licensor applied this Public
License.
f. Licensed Rights means the rights granted to You subject to the
terms and conditions of this Public License, which are limited to
all Copyright and Similar Rights that apply to Your use of the
Licensed Material and that the Licensor has authority to license.
g. Licensor means the individual(s) or entity(ies) granting rights
under this Public License.
h. NonCommercial means not primarily intended for or directed towards
commercial advantage or monetary compensation. For purposes of
this Public License, the exchange of the Licensed Material for
other material subject to Copyright and Similar Rights by digital
file-sharing or similar means is NonCommercial provided there is
no payment of monetary compensation in connection with the
exchange.
i. Share means to provide material to the public by any means or
process that requires permission under the Licensed Rights, such
as reproduction, public display, public performance, distribution,
dissemination, communication, or importation, and to make material
available to the public including in ways that members of the
public may access the material from a place and at a time
individually chosen by them.
j. Sui Generis Database Rights means rights other than copyright
resulting from Directive 96/9/EC of the European Parliament and of
the Council of 11 March 1996 on the legal protection of databases,
as amended and/or succeeded, as well as other essentially
equivalent rights anywhere in the world.
k. You means the individual or entity exercising the Licensed Rights
under this Public License. Your has a corresponding meaning.
Section 2 -- Scope.
a. License grant.
1. Subject to the terms and conditions of this Public License,
the Licensor hereby grants You a worldwide, royalty-free,
non-sublicensable, non-exclusive, irrevocable license to
exercise the Licensed Rights in the Licensed Material to:
a. reproduce and Share the Licensed Material, in whole or
in part, for NonCommercial purposes only; and
b. produce and reproduce, but not Share, Adapted Material
for NonCommercial purposes only.
2. Exceptions and Limitations. For the avoidance of doubt, where
Exceptions and Limitations apply to Your use, this Public
License does not apply, and You do not need to comply with
its terms and conditions.
3. Term. The term of this Public License is specified in Section
6(a).
4. Media and formats; technical modifications allowed. The
Licensor authorizes You to exercise the Licensed Rights in
all media and formats whether now known or hereafter created,
and to make technical modifications necessary to do so. The
Licensor waives and/or agrees not to assert any right or
authority to forbid You from making technical modifications
necessary to exercise the Licensed Rights, including
technical modifications necessary to circumvent Effective
Technological Measures. For purposes of this Public License,
simply making modifications authorized by this Section 2(a)
(4) never produces Adapted Material.
5. Downstream recipients.
a. Offer from the Licensor -- Licensed Material. Every
recipient of the Licensed Material automatically
receives an offer from the Licensor to exercise the
Licensed Rights under the terms and conditions of this
Public License.
b. No downstream restrictions. You may not offer or impose
any additional or different terms or conditions on, or
apply any Effective Technological Measures to, the
Licensed Material if doing so restricts exercise of the
Licensed Rights by any recipient of the Licensed
Material.
6. No endorsement. Nothing in this Public License constitutes or
may be construed as permission to assert or imply that You
are, or that Your use of the Licensed Material is, connected
with, or sponsored, endorsed, or granted official status by,
the Licensor or others designated to receive attribution as
provided in Section 3(a)(1)(A)(i).
b. Other rights.
1. Moral rights, such as the right of integrity, are not
licensed under this Public License, nor are publicity,
privacy, and/or other similar personality rights; however, to
the extent possible, the Licensor waives and/or agrees not to
assert any such rights held by the Licensor to the limited
extent necessary to allow You to exercise the Licensed
Rights, but not otherwise.
2. Patent and trademark rights are not licensed under this
Public License.
3. To the extent possible, the Licensor waives any right to
collect royalties from You for the exercise of the Licensed
Rights, whether directly or through a collecting society
under any voluntary or waivable statutory or compulsory
licensing scheme. In all other cases the Licensor expressly
reserves any right to collect such royalties, including when
the Licensed Material is used other than for NonCommercial
purposes.
Section 3 -- License Conditions.
Your exercise of the Licensed Rights is expressly made subject to the
following conditions.
a. Attribution.
1. If You Share the Licensed Material, You must:
a. retain the following if it is supplied by the Licensor
with the Licensed Material:
i. identification of the creator(s) of the Licensed
Material and any others designated to receive
attribution, in any reasonable manner requested by
the Licensor (including by pseudonym if
designated);
ii. a copyright notice;
iii. a notice that refers to this Public License;
iv. a notice that refers to the disclaimer of
warranties;
v. a URI or hyperlink to the Licensed Material to the
extent reasonably practicable;
b. indicate if You modified the Licensed Material and
retain an indication of any previous modifications; and
c. indicate the Licensed Material is licensed under this
Public License, and include the text of, or the URI or
hyperlink to, this Public License.
For the avoidance of doubt, You do not have permission under
this Public License to Share Adapted Material.
2. You may satisfy the conditions in Section 3(a)(1) in any
reasonable manner based on the medium, means, and context in
which You Share the Licensed Material. For example, it may be
reasonable to satisfy the conditions by providing a URI or
hyperlink to a resource that includes the required
information.
3. If requested by the Licensor, You must remove any of the
information required by Section 3(a)(1)(A) to the extent
reasonably practicable.
Section 4 -- Sui Generis Database Rights.
Where the Licensed Rights include Sui Generis Database Rights that
apply to Your use of the Licensed Material:
a. for the avoidance of doubt, Section 2(a)(1) grants You the right
to extract, reuse, reproduce, and Share all or a substantial
portion of the contents of the database for NonCommercial purposes
only and provided You do not Share Adapted Material;
b. if You include all or a substantial portion of the database
contents in a database in which You have Sui Generis Database
Rights, then the database in which You have Sui Generis Database
Rights (but not its individual contents) is Adapted Material; and
c. You must comply with the conditions in Section 3(a) if You Share
all or a substantial portion of the contents of the database.
For the avoidance of doubt, this Section 4 supplements and does not
replace Your obligations under this Public License where the Licensed
Rights include other Copyright and Similar Rights.
Section 5 -- Disclaimer of Warranties and Limitation of Liability.
a. UNLESS OTHERWISE SEPARATELY UNDERTAKEN BY THE LICENSOR, TO THE
EXTENT POSSIBLE, THE LICENSOR OFFERS THE LICENSED MATERIAL AS-IS
AND AS-AVAILABLE, AND MAKES NO REPRESENTATIONS OR WARRANTIES OF
ANY KIND CONCERNING THE LICENSED MATERIAL, WHETHER EXPRESS,
IMPLIED, STATUTORY, OR OTHER. THIS INCLUDES, WITHOUT LIMITATION,
WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR
PURPOSE, NON-INFRINGEMENT, ABSENCE OF LATENT OR OTHER DEFECTS,
ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT
KNOWN OR DISCOVERABLE. WHERE DISCLAIMERS OF WARRANTIES ARE NOT
ALLOWED IN FULL OR IN PART, THIS DISCLAIMER MAY NOT APPLY TO YOU.
b. TO THE EXTENT POSSIBLE, IN NO EVENT WILL THE LICENSOR BE LIABLE
TO YOU ON ANY LEGAL THEORY (INCLUDING, WITHOUT LIMITATION,
NEGLIGENCE) OR OTHERWISE FOR ANY DIRECT, SPECIAL, INDIRECT,
INCIDENTAL, CONSEQUENTIAL, PUNITIVE, EXEMPLARY, OR OTHER LOSSES,
COSTS, EXPENSES, OR DAMAGES ARISING OUT OF THIS PUBLIC LICENSE OR
USE OF THE LICENSED MATERIAL, EVEN IF THE LICENSOR HAS BEEN
ADVISED OF THE POSSIBILITY OF SUCH LOSSES, COSTS, EXPENSES, OR
DAMAGES. WHERE A LIMITATION OF LIABILITY IS NOT ALLOWED IN FULL OR
IN PART, THIS LIMITATION MAY NOT APPLY TO YOU.
c. The disclaimer of warranties and limitation of liability provided
above shall be interpreted in a manner that, to the extent
possible, most closely approximates an absolute disclaimer and
waiver of all liability.
Section 6 -- Term and Termination.
a. This Public License applies for the term of the Copyright and
Similar Rights licensed here. However, if You fail to comply with
this Public License, then Your rights under this Public License
terminate automatically.
b. Where Your right to use the Licensed Material has terminated under
Section 6(a), it reinstates:
1. automatically as of the date the violation is cured, provided
it is cured within 30 days of Your discovery of the
violation; or
2. upon express reinstatement by the Licensor.
For the avoidance of doubt, this Section 6(b) does not affect any
right the Licensor may have to seek remedies for Your violations
of this Public License.
c. For the avoidance of doubt, the Licensor may also offer the
Licensed Material under separate terms or conditions or stop
distributing the Licensed Material at any time; however, doing so
will not terminate this Public License.
d. Sections 1, 5, 6, 7, and 8 survive termination of this Public
License.
Section 7 -- Other Terms and Conditions.
a. The Licensor shall not be bound by any additional or different
terms or conditions communicated by You unless expressly agreed.
b. Any arrangements, understandings, or agreements regarding the
Licensed Material not stated herein are separate from and
independent of the terms and conditions of this Public License.
Section 8 -- Interpretation.
a. For the avoidance of doubt, this Public License does not, and
shall not be interpreted to, reduce, limit, restrict, or impose
conditions on any use of the Licensed Material that could lawfully
be made without permission under this Public License.
b. To the extent possible, if any provision of this Public License is
deemed unenforceable, it shall be automatically reformed to the
minimum extent necessary to make it enforceable. If the provision
cannot be reformed, it shall be severed from this Public License
without affecting the enforceability of the remaining terms and
conditions.
c. No term or condition of this Public License will be waived and no
failure to comply consented to unless expressly agreed to by the
Licensor.
d. Nothing in this Public License constitutes or may be interpreted
as a limitation upon, or waiver of, any privileges and immunities
that apply to the Licensor or You, including from the legal
processes of any jurisdiction or authority.
=======================================================================
Creative Commons is not a party to its public
licenses. Notwithstanding, Creative Commons may elect to apply one of
its public licenses to material it publishes and in those instances
will be considered the “Licensor.” The text of the Creative Commons
public licenses is dedicated to the public domain under the CC0 Public
Domain Dedication. Except for the limited purpose of indicating that
material is shared under a Creative Commons public license or as
otherwise permitted by the Creative Commons policies published at
creativecommons.org/policies, Creative Commons does not authorize the
use of the trademark "Creative Commons" or any other trademark or logo
of Creative Commons without its prior written consent including,
without limitation, in connection with any unauthorized modifications
to any of its public licenses or any other arrangements,
understandings, or agreements concerning use of licensed material. For
the avoidance of doubt, this paragraph does not form part of the
public licenses.
Creative Commons may be contacted at creativecommons.org.

View File

@ -3,6 +3,78 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
## [1.0.7] - 2019-09-26
### Fixed
- Broken federation on Erlang 22 (previous versions of hackney http client were using an option that got deprecated)
### Changed
- ActivityPub: The first page in inboxes/outboxes is no longer embedded.
## [1.0.6] - 2019-08-14
### Fixed
- MRF: fix use of unserializable keyword lists in describe() implementations
- ActivityPub S2S: POST requests are now signed with `(request-target)` pseudo-header.
## [1.0.5] - 2019-08-13
### Fixed
- Mastodon API: follower/following counters not being nullified, when `hide_follows`/`hide_followers` is set
- Mastodon API: `muted` in the Status entity, using author's account to determine if the thread was muted
- Mastodon API: return the actual profile URL in the Account entity's `url` property when appropriate
- Templates: properly style anchor tags
- Objects being re-embedded to activities after being updated (e.g. faved/reposted). Running 'mix pleroma.database prune_objects' again is advised.
- Not being able to access the Mastodon FE login page on private instances
- MRF: ensure that subdomain_match calls are case-insensitive
- Fix internal server error when using the healthcheck API.
### Added
- **Breaking:** MRF describe API, which adds support for exposing configuration information about MRF policies to NodeInfo.
Custom modules will need to be updated by adding, at the very least, `def describe, do: {:ok, %{}}` to the MRF policy modules.
- Relays: Added a task to list relay subscriptions.
- MRF: Support for filtering posts based on ActivityStreams vocabulary (`Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`)
- MRF (Simple Policy): Support for wildcard domains.
- Support for wildcard domains in user domain blocks setting.
- Configuration: `quarantined_instances` support wildcard domains.
- Mix Tasks: `mix pleroma.database fix_likes_collections`
- Configuration: `federation_incoming_replies_max_depth` option
### Removed
- Federation: Remove `likes` from objects.
- ActivityPub: The `accept_blocks` configuration setting.
## [1.0.4] - 2019-08-01
### Fixed
- Invalid SemVer version generation, when the current branch does not have commits ahead of tag/checked out on a tag
## [1.0.3] - 2019-07-31
### Security
- OStatus: eliminate the possibility of a protocol downgrade attack.
- OStatus: prevent following locked accounts, bypassing the approval process.
- TwitterAPI: use CommonAPI to handle remote follows instead of OStatus.
### Fixed
- `pleroma_ctl` not detecting the master branch properly. If you get "Releases are built only for master and develop branches" error when updating, please add `-` to the end of the line in `releases/start_erl.data`
## [1.0.2] - 2019-07-28
### Fixed
- Not being able to pin unlisted posts
- Mastodon API: represent poll IDs as strings
- MediaProxy: fix matching filenames
- MediaProxy: fix filename encoding
- Migrations: fix a sporadic migration failure
- Metadata rendering errors resulting in the entire page being inaccessible
- Federation/MediaProxy not working with instances that have wrong certificate order
- ActivityPub S2S: remote user deletions now work the same as local user deletions.
### Changed
- Configuration: OpenGraph and TwitterCard providers enabled by default
- Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
## [1.0.1] - 2019-07-14
### Security
- OStatus: fix an object spoofing vulnerability.
### Added
- MRF: Support for excluding specific domains from Transparency.
## [1.0.0] - 2019-06-29
### Security
- Mastodon API: Fix display names not being sanitized
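
The breaking MRF entry in the 1.0.5 "Added" section above notes that custom policy modules need, at minimum, `def describe, do: {:ok, %{}}`. As a hedged sketch only (the module name is hypothetical, and the behaviour and return shapes are assumed to match the built-in policies touched elsewhere in this compare), an updated custom policy might look like this:

```elixir
# Hypothetical custom MRF policy updated for the describe() API mentioned above.
defmodule MyInstance.MRF.RejectExamplePolicy do
  # Assumed behaviour module, following the built-in policies in this compare.
  @behaviour Pleroma.Web.ActivityPub.MRF

  # Pre-existing MRF entry point: drop anything from example.com, pass everything else.
  @impl true
  def filter(%{"actor" => "https://example.com/" <> _}), do: {:reject, nil}
  def filter(message), do: {:ok, message}

  # New minimum required by the breaking change: expose an (empty) config map to NodeInfo.
  @impl true
  def describe, do: {:ok, %{}}
end
```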

View File

@ -194,6 +194,8 @@
send_user_agent: true,
adapter: [
ssl_options: [
# Workaround for remote server certificate chain issues
partial_chain: &:hackney_connect.partial_chain/1,
# We don't support TLS v1.3 yet
versions: [:tlsv1, :"tlsv1.1", :"tlsv1.2"]
]
@ -218,6 +220,7 @@
},
registrations_open: true,
federating: true,
federation_incoming_replies_max_depth: 100,
federation_reachability_timeout_days: 7,
federation_publisher_modules: [
Pleroma.Web.ActivityPub.Publisher,
@ -237,6 +240,7 @@
"text/bbcode"
],
mrf_transparency: true,
mrf_transparency_exclusions: [],
autofollowed_nicknames: [],
max_pinned_statuses: 1,
no_attachment_links: false,
@ -297,7 +301,6 @@
default_mascot: :pleroma_fox_tan
config :pleroma, :activitypub,
accept_blocks: true,
unfollow_blocked: true,
outgoing_blocks: true,
follow_handshake_timeout: 500
@ -331,6 +334,10 @@
config :pleroma, :mrf_subchain, match_actor: %{}
config :pleroma, :mrf_vocabulary,
accept: [],
reject: []
config :pleroma, :rich_media,
enabled: true,
ignore_hosts: [],
@ -358,7 +365,11 @@
port: 9999
config :pleroma, Pleroma.Web.Metadata,
providers: [Pleroma.Web.Metadata.Providers.RelMe],
providers: [
Pleroma.Web.Metadata.Providers.OpenGraph,
Pleroma.Web.Metadata.Providers.TwitterCard,
Pleroma.Web.Metadata.Providers.RelMe
],
unfurl_nsfw: false
config :pleroma, :suggestions,

View File

@ -31,10 +31,11 @@ Feel free to contact us to be added to this list!
- Features: No Streaming
### Fedilab
- Source Code: <https://gitlab.com/tom79/mastalab/>
- Contact: [@tom79@mastodon.social](https://mastodon.social/users/tom79)
- Homepage: <https://fedilab.app/>
- Source Code: <https://framagit.org/tom79/fedilab/>
- Contact: [@fedilab@framapiaf.org](https://framapiaf.org/users/fedilab)
- Platforms: Android
- Features: Streaming Ready
- Features: Streaming Ready, Moderation, Text Formatting
### Nekonium
- Homepage: [F-Droid Repository](https://repo.gdgd.jp.net/), [Google Play](https://play.google.com/store/apps/details?id=com.apps.nekonium), [Amazon](https://www.amazon.co.jp/dp/B076FXPRBC/)

View File

@ -87,6 +87,7 @@ config :pleroma, Pleroma.Emails.Mailer,
* `invites_enabled`: Enable user invitations for admins (depends on `registrations_open: false`).
* `account_activation_required`: Require users to confirm their emails before signing in.
* `federating`: Enable federation with other instances
* `federation_incoming_replies_max_depth`: Max. depth of reply-to activities fetching on incoming federation, to prevent out-of-memory situations while fetching very long threads. If set to `nil`, threads of any depth will be fetched. Lower this value if you experience out-of-memory crashes.
* `federation_reachability_timeout_days`: Timeout (in days) of each external federation target being unreachable prior to pausing federating to it.
* `allow_relay`: Enable Pleroma's Relay, which makes it possible to follow a whole instance
* `rewrite_policy`: Message Rewrite Policy, either one or a list. Here are the ones available by default:
@ -98,11 +99,13 @@ config :pleroma, Pleroma.Emails.Mailer,
* `Pleroma.Web.ActivityPub.MRF.RejectNonPublic`: Drops posts with non-public visibility settings (See ``:mrf_rejectnonpublic`` section)
* `Pleroma.Web.ActivityPub.MRF.EnsureRePrepended`: Rewrites posts to ensure that replies to posts with subjects do not have an identical subject and instead begin with re:.
* `Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy`: Rejects posts from likely spambots by rejecting posts from new users that contain links.
* `Pleroma.Web.ActivityPub.MRF.VocabularyPolicy`: Restricts activities to a configured set of vocabulary. (see `:mrf_vocabulary` section)
* `public`: Makes the client API available in authenticated mode only, except for user profiles. Useful for disabling the Local Timeline and The Whole Known Network.
* `quarantined_instances`: List of ActivityPub instances where private (DMs, followers-only) activities will not be sent.
* `managed_config`: Whether the config for pleroma-fe is kept in this config or in ``static/config.json``
* `allowed_post_formats`: MIME-type list of formats allowed to be posted (transformed into HTML)
* `mrf_transparency`: Make the content of your Message Rewrite Facility settings public (via nodeinfo).
* `mrf_transparency_exclusions`: Exclude specific instance names from MRF transparency. The use of the exclusions feature will be disclosed in nodeinfo as a boolean value.
* `scope_copy`: Copy the scope (private/unlisted/public) in replies to posts by default.
* `subject_line_behavior`: Allows changing the default behaviour of subject lines in replies. Valid values:
* "email": Copy and preprend re:, as in email.
@ -265,6 +268,10 @@ config :pleroma, :mrf_subchain,
* `federated_timeline_removal`: A list of patterns which result in message being removed from federated timelines (a.k.a unlisted), each pattern can be a string or a [regular expression](https://hexdocs.pm/elixir/Regex.html)
* `replace`: A list of tuples containing `{pattern, replacement}`, `pattern` can be a string or a [regular expression](https://hexdocs.pm/elixir/Regex.html)
## :mrf_vocabulary
* `accept`: A list of ActivityStreams terms to accept. If empty, all supported messages are accepted.
* `reject`: A list of ActivityStreams terms to reject. If empty, no messages are rejected.
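
The defaults added to `config.exs` earlier in this compare are empty lists for both keys. A rough, illustrative sketch of enabling the new `VocabularyPolicy` in a site config (the term list is an example, not a recommendation) might be:

```elixir
# Illustrative only: enable the new policy and constrain accepted ActivityStreams terms.
config :pleroma, :instance,
  rewrite_policy: [Pleroma.Web.ActivityPub.MRF.VocabularyPolicy]

config :pleroma, :mrf_vocabulary,
  # Only these terms are accepted; an empty list (the default) accepts everything.
  accept: ["Create", "Note", "Like", "Announce", "Follow", "Accept", "Undo", "Delete"],
  # An empty list (the default) rejects nothing.
  reject: []
```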
## :media_proxy
* `enabled`: Enables proxying of remote media to the instance's proxy
* `base_url`: The base URL to access a user-uploaded file. Useful when you want to proxy the media files via another host/CDN fronts.
@ -318,7 +325,6 @@ config :pleroma, Pleroma.Web.Endpoint,
This will make Pleroma listen on `127.0.0.1` port `8080` and generate urls starting with `https://example.com:2020`
## :activitypub
* ``accept_blocks``: Whether to accept incoming block activities from other instances
* ``unfollow_blocked``: Whether blocks result in people getting unfollowed
* ``outgoing_blocks``: Whether to federate blocks to other instances
* ``deny_follow_blocked``: Whether to disallow following an account that has blocked the user in question
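
Taken together with the `config.exs` hunk earlier in this compare, a minimal sketch of these settings after the `accept_blocks` removal (the values shown mirror the defaults in that hunk, not tuned recommendations) could be:

```elixir
# :activitypub after the accept_blocks removal; values mirror the config.exs defaults above.
config :pleroma, :activitypub,
  unfollow_blocked: true,
  outgoing_blocks: true,
  follow_handshake_timeout: 500

# The new reply-depth limit documented under :instance above; nil removes the limit entirely.
config :pleroma, :instance,
  federation_incoming_replies_max_depth: 100
```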

View File

@ -87,7 +87,7 @@ sudo adduser -S -s /bin/false -h /opt/pleroma -H pleroma
```shell
sudo mkdir -p /opt/pleroma
sudo chown -R pleroma:pleroma /opt/pleroma
sudo -Hu pleroma git clone -b master https://git.pleroma.social/pleroma/pleroma /opt/pleroma
sudo -Hu pleroma git clone -b stable https://git.pleroma.social/pleroma/pleroma /opt/pleroma
```
* Change to the new directory:

View File

@ -66,7 +66,7 @@ sudo useradd -r -s /bin/false -m -d /var/lib/pleroma -U pleroma
```shell
sudo mkdir -p /opt/pleroma
sudo chown -R pleroma:pleroma /opt/pleroma
sudo -Hu pleroma git clone -b master https://git.pleroma.social/pleroma/pleroma /opt/pleroma
sudo -Hu pleroma git clone -b stable https://git.pleroma.social/pleroma/pleroma /opt/pleroma
```
* Change to the new directory:

View File

@ -143,7 +143,7 @@ sudo useradd -r -s /bin/false -m -d /var/lib/pleroma -U pleroma
```shell
sudo mkdir -p /opt/pleroma
sudo chown -R pleroma:pleroma /opt/pleroma
sudo -Hu pleroma git clone -b master https://git.pleroma.social/pleroma/pleroma /opt/pleroma
sudo -Hu pleroma git clone -b stable https://git.pleroma.social/pleroma/pleroma /opt/pleroma
```
* Change to the new directory:

View File

@ -68,7 +68,7 @@ sudo useradd -r -s /bin/false -m -d /var/lib/pleroma -U pleroma
```shell
sudo mkdir -p /opt/pleroma
sudo chown -R pleroma:pleroma /opt/pleroma
sudo -Hu pleroma git clone -b master https://git.pleroma.social/pleroma/pleroma /opt/pleroma
sudo -Hu pleroma git clone -b stable https://git.pleroma.social/pleroma/pleroma /opt/pleroma
```
* Change to the new directory:

View File

@ -69,7 +69,9 @@ cd ~
* Clone the Git repository.
```
git clone -b master https://git.pleroma.social/pleroma/pleroma
sudo mkdir -p /opt/pleroma
sudo chown -R pleroma:pleroma /opt/pleroma
sudo -Hu pleroma git clone -b stable https://git.pleroma.social/pleroma/pleroma /opt/pleroma
```
* Change to the new directory.

View File

@ -106,7 +106,7 @@ It is highly recommended you use your own fork for the `https://path/to/repo` pa
```shell
pleroma$ cd ~
pleroma$ git clone -b master https://path/to/repo
pleroma$ git clone -b stable https://path/to/repo
```
* Change to the new directory:

View File

@ -49,7 +49,7 @@ mkdir -p /var/lib/pleroma/static
chown -R pleroma /var/lib/pleroma
# If you use the local uploader with default settings your uploads should be located in `~pleroma/uploads`
mv ~pleroma/uploads /var/lib/pleroma/uploads
mv ~pleroma/uploads/* /var/lib/pleroma/uploads
# If you have created the custom public files directory with default settings it should be located in `~pleroma/instance/static`
mv ~pleroma/instance/static /var/lib/pleroma/static
@ -96,9 +96,9 @@ rm -r ~pleroma/*
export FLAVOUR="arm64-musl"
# Clone the release build into a temporary directory and unpack it
# Replace `master` with `develop` if you want to run the develop branch
# Replace `stable` with `unstable` if you want to run the unstable branch
su pleroma -s $SHELL -lc "
curl 'https://git.pleroma.social/api/v4/projects/2/jobs/artifacts/master/download?job=$FLAVOUR' -o /tmp/pleroma.zip
curl 'https://git.pleroma.social/api/v4/projects/2/jobs/artifacts/stable/download?job=$FLAVOUR' -o /tmp/pleroma.zip
unzip /tmp/pleroma.zip -d /tmp/
"
@ -122,13 +122,15 @@ su pleroma -s $SHELL -lc "./bin/pleroma stop"
## Setting up a system service
OTP releases have different service files than from-source installs so they need to be copied over again.
**Warning:** The service files assume pleroma user's home directory is `/opt/pleroma`, please make sure all paths fit your installation.
Debian/Ubuntu:
```sh
# Copy the service into a proper directory
cp ~pleroma/installation/pleroma.service /etc/systemd/system/pleroma.service
# Reload service files
systemctl reload-daemon
systemctl daemon-reload
# Reenable pleroma to start on boot
systemctl reenable pleroma

View File

@ -58,7 +58,7 @@ Clone the repository:
```
$ cd /home/pleroma
$ git clone -b master https://git.pleroma.social/pleroma/pleroma.git
$ git clone -b stable https://git.pleroma.social/pleroma/pleroma.git
```
Configure Pleroma. Note that you need a domain name at this point:

View File

@ -29,7 +29,7 @@ This creates a "pleroma" login class and sets higher values than default for dat
Create the \_pleroma user, assign it the pleroma login class and create its home directory (/home/\_pleroma/): `useradd -m -L pleroma _pleroma`
#### Clone pleroma's directory
Enter a shell as the \_pleroma user. As root, run `su _pleroma -;cd`. Then clone the repository with `git clone -b master https://git.pleroma.social/pleroma/pleroma.git`. Pleroma is now installed in /home/\_pleroma/pleroma/, it will be configured and started at the end of this guide.
Enter a shell as the \_pleroma user. As root, run `su _pleroma -;cd`. Then clone the repository with `git clone -b stable https://git.pleroma.social/pleroma/pleroma.git`. Pleroma is now installed in /home/\_pleroma/pleroma/, it will be configured and started at the end of this guide.
#### Postgresql
Start a shell as the \_postgresql user (as root run `su _postgresql -`), then run the `initdb` command to initialize postgresql:

View File

@ -44,7 +44,7 @@ Switch to the pleroma user and go to your home directory:
Download Pleroma's source code:
`$ git clone -b master https://git.pleroma.social/pleroma/pleroma.git`
`$ git clone -b stable https://git.pleroma.social/pleroma/pleroma.git`
`$ cd pleroma`

View File

@ -80,7 +80,7 @@ export FLAVOUR="arm64-musl"
# Clone the release build into a temporary directory and unpack it
su pleroma -s $SHELL -lc "
curl 'https://git.pleroma.social/api/v4/projects/2/jobs/artifacts/master/download?job=$FLAVOUR' -o /tmp/pleroma.zip
curl 'https://git.pleroma.social/api/v4/projects/2/jobs/artifacts/stable/download?job=$FLAVOUR' -o /tmp/pleroma.zip
unzip /tmp/pleroma.zip -d /tmp/
"
@ -207,7 +207,7 @@ certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --
# Add it to the daily cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --dry-run --post-hook "systemctl reload nginx"
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --post-hook "systemctl reload nginx"
' > /etc/cron.daily/renew-pleroma-cert
chmod +x /etc/cron.daily/renew-pleroma-cert
@ -228,7 +228,7 @@ certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --
# Add it to the daily cron
echo '#!/bin/sh
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --dry-run --post-hook "rc-service nginx reload"
certbot renew --cert-name yourinstance.tld --webroot -w /var/lib/letsencrypt/ --post-hook "rc-service nginx reload"
' > /etc/periodic/daily/renew-pleroma-cert
chmod +x /etc/periodic/daily/renew-pleroma-cert
@ -242,6 +242,14 @@ So for example, if the task is `mix pleroma.user set admin --admin`, you should
```sh
su pleroma -s $SHELL -lc "./bin/pleroma_ctl user set admin --admin"
```
## Create your first user and set as admin
```sh
cd /opt/pleroma/bin
su pleroma -s $SHELL -lc "./bin/pleroma_ctl user new joeuser joeuser@sld.tld --admin"
```
This will create an account with the username 'joeuser' and the email address joeuser@sld.tld, and set that user's account as an admin. This will result in a link that you can paste into the browser, which logs you in and enables you to set the password.
### Updating
Generally, doing the following is enough:
```sh

View File

@ -35,6 +35,10 @@ defmodule Mix.Tasks.Pleroma.Database do
## Remove duplicated items from following and update followers count for all users
mix pleroma.database update_users_following_followers_counts
## Fix the pre-existing "likes" collections for all objects
mix pleroma.database fix_likes_collections
"""
def run(["remove_embedded_objects" | args]) do
{options, [], []} =
@ -119,4 +123,36 @@ def run(["prune_objects" | args]) do
)
end
end
def run(["fix_likes_collections"]) do
import Ecto.Query
start_pleroma()
from(object in Object,
where: fragment("(?)->>'likes' is not null", object.data),
select: %{id: object.id, likes: fragment("(?)->>'likes'", object.data)}
)
|> Pleroma.RepoStreamer.chunk_stream(100)
|> Stream.each(fn objects ->
ids =
objects
|> Enum.filter(fn object -> object.likes |> Jason.decode!() |> is_map() end)
|> Enum.map(& &1.id)
Object
|> where([object], object.id in ^ids)
|> update([object],
set: [
data:
fragment(
"jsonb_set(?, '{likes}', '[]'::jsonb, true)",
object.data
)
]
)
|> Repo.update_all([], timeout: :infinity)
end)
|> Stream.run()
end
end

View File

@ -34,6 +34,8 @@ defmodule Mix.Tasks.Pleroma.Instance do
- `--db-configurable Y/N` - Allow/disallow configuring instance from admin part
- `--uploads-dir` - the directory uploads go in when using a local uploader
- `--static-dir` - the directory custom public files should be read from (custom emojis, frontend bundle overrides, robots.txt, etc.)
- `--listen-ip` - the ip the app should listen to, defaults to 127.0.0.1
- `--listen-port` - the port the app should listen to, defaults to 4000
"""
def run(["gen" | rest]) do
@ -56,7 +58,9 @@ def run(["gen" | rest]) do
indexable: :string,
db_configurable: :string,
uploads_dir: :string,
static_dir: :string
static_dir: :string,
listen_ip: :string,
listen_port: :string
],
aliases: [
o: :output,
@ -146,10 +150,26 @@ def run(["gen" | rest]) do
"n"
) === "y"
listen_port =
get_option(
options,
:listen_port,
"What port will the app listen to (leave it if you are using the default setup with nginx)?",
4000
)
listen_ip =
get_option(
options,
:listen_ip,
"What ip will the app listen to (leave it if you are using the default setup with nginx)?",
"127.0.0.1"
)
uploads_dir =
get_option(
options,
:upload_dir,
:uploads_dir,
"What directory should media uploads go in (when using the local uploader)?",
Pleroma.Config.get([Pleroma.Uploaders.Local, :uploads])
)
@ -186,7 +206,9 @@ def run(["gen" | rest]) do
db_configurable?: db_configurable?,
static_dir: static_dir,
uploads_dir: uploads_dir,
rum_enabled: rum_enabled
rum_enabled: rum_enabled,
listen_ip: listen_ip,
listen_port: listen_port
)
result_psql =

View File

@ -5,6 +5,7 @@
defmodule Mix.Tasks.Pleroma.Relay do
use Mix.Task
import Mix.Pleroma
alias Pleroma.User
alias Pleroma.Web.ActivityPub.Relay
@shortdoc "Manages remote relays"
@ -22,6 +23,10 @@ defmodule Mix.Tasks.Pleroma.Relay do
``mix pleroma.relay unfollow <relay_url>``
Example: ``mix pleroma.relay unfollow https://example.org/relay``
## List relay subscriptions
``mix pleroma.relay list``
"""
def run(["follow", target]) do
start_pleroma()
@ -44,4 +49,19 @@ def run(["unfollow", target]) do
{:error, e} -> shell_error("Error while following #{target}: #{inspect(e)}")
end
end
def run(["list"]) do
start_pleroma()
with %User{} = user <- Relay.get_actor() do
user.following
|> Enum.each(fn entry ->
URI.parse(entry)
|> Map.get(:host)
|> shell_info()
end)
else
e -> shell_error("Error while fetching relay subscription list: #{inspect(e)}")
end
end
end

View File

@ -31,8 +31,8 @@ defmodule Mix.Tasks.Pleroma.User do
mix pleroma.user invite [OPTION...]
Options:
- `--expires_at DATE` - last day on which token is active (e.g. "2019-04-05")
- `--max_use NUMBER` - maximum numbers of token uses
- `--expires-at DATE` - last day on which token is active (e.g. "2019-04-05")
- `--max-use NUMBER` - maximum numbers of token uses
## List generated invites

View File

@ -11,6 +11,7 @@ defmodule Pleroma.HTTP.Connection do
connect_timeout: 10_000,
recv_timeout: 20_000,
follow_redirect: true,
force_redirect: true,
pool: :federation
]
@adapter Application.get_env(:tesla, :adapter)
@ -29,7 +30,7 @@ def new(opts \\ []) do
# fetch Hackney options
#
defp hackney_options(opts) do
def hackney_options(opts) do
options = Keyword.get(opts, :adapter, [])
adapter_options = Pleroma.Config.get([:http, :adapter], [])
proxy_url = Pleroma.Config.get([:http, :proxy_url], nil)

View File

@ -65,10 +65,7 @@ defp process_sni_options(options, url) do
end
def process_request_options(options) do
case Pleroma.Config.get([:http, :proxy_url]) do
nil -> options
proxy -> options ++ [proxy: proxy]
end
Keyword.merge(Pleroma.HTTP.Connection.hackney_options([]), options)
end
@doc """

View File

@ -44,20 +44,20 @@ def get_by_ap_id(ap_id) do
Repo.one(from(object in Object, where: fragment("(?)->>'id' = ?", object.data, ^ap_id)))
end
def normalize(_, fetch_remote \\ true)
def normalize(_, fetch_remote \\ true, options \\ [])
# If we pass an Activity to Object.normalize(), we can try to use the preloaded object.
# Use this whenever possible, especially when walking graphs in an O(N) loop!
def normalize(%Object{} = object, _), do: object
def normalize(%Activity{object: %Object{} = object}, _), do: object
def normalize(%Object{} = object, _, _), do: object
def normalize(%Activity{object: %Object{} = object}, _, _), do: object
# A hack for fake activities
def normalize(%Activity{data: %{"object" => %{"fake" => true} = data}}, _) do
def normalize(%Activity{data: %{"object" => %{"fake" => true} = data}}, _, _) do
%Object{id: "pleroma:fake_object_id", data: data}
end
# Catch and log Object.normalize() calls where the Activity's child object is not
# preloaded.
def normalize(%Activity{data: %{"object" => %{"id" => ap_id}}}, fetch_remote) do
def normalize(%Activity{data: %{"object" => %{"id" => ap_id}}}, fetch_remote, _) do
Logger.debug(
"Object.normalize() called without preloaded object (#{ap_id}). Consider preloading the object!"
)
@ -67,7 +67,7 @@ def normalize(%Activity{data: %{"object" => %{"id" => ap_id}}}, fetch_remote) do
normalize(ap_id, fetch_remote)
end
def normalize(%Activity{data: %{"object" => ap_id}}, fetch_remote) do
def normalize(%Activity{data: %{"object" => ap_id}}, fetch_remote, _) do
Logger.debug(
"Object.normalize() called without preloaded object (#{ap_id}). Consider preloading the object!"
)
@ -78,10 +78,14 @@ def normalize(%Activity{data: %{"object" => ap_id}}, fetch_remote) do
end
# Old way, try fetching the object through cache.
def normalize(%{"id" => ap_id}, fetch_remote), do: normalize(ap_id, fetch_remote)
def normalize(ap_id, false) when is_binary(ap_id), do: get_cached_by_ap_id(ap_id)
def normalize(ap_id, true) when is_binary(ap_id), do: Fetcher.fetch_object_from_id!(ap_id)
def normalize(_, _), do: nil
def normalize(%{"id" => ap_id}, fetch_remote, _), do: normalize(ap_id, fetch_remote)
def normalize(ap_id, false, _) when is_binary(ap_id), do: get_cached_by_ap_id(ap_id)
def normalize(ap_id, true, options) when is_binary(ap_id) do
Fetcher.fetch_object_from_id!(ap_id, options)
end
def normalize(_, _, _), do: nil
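A minimal sketch of the new three-argument form, assuming the usual aliases: the options keyword list (notably :depth) is simply forwarded to the fetcher, which is how the reply-depth limit reaches remote fetches.

# assuming alias Pleroma.Object and alias Pleroma.Object.Fetcher
Object.normalize(ap_id, true, depth: 1)
# is roughly equivalent to
Fetcher.fetch_object_from_id!(ap_id, depth: 1)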
# Owned objects can only be mutated by their owner
def authorize_mutation(%Object{data: %{"actor" => actor}}, %User{ap_id: ap_id}),

View File

@ -48,6 +48,9 @@ def contain_origin(id, %{"actor" => _actor} = params) do
end
end
def contain_origin(id, %{"attributedTo" => actor} = params),
do: contain_origin(id, Map.put(params, "actor", actor))
def contain_origin_from_id(_id, %{"id" => nil}), do: :error
def contain_origin_from_id(id, %{"id" => other_id} = _params) do
@ -60,4 +63,9 @@ def contain_origin_from_id(id, %{"id" => other_id} = _params) do
:error
end
end
def contain_child(%{"object" => %{"id" => id, "attributedTo" => _} = object}),
do: contain_origin(id, object)
def contain_child(_), do: :ok
end
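Roughly, contain_origin/2 and the new contain_child/1 succeed only when the object id and its actor/attributedTo share an origin host. The example below uses made-up ids and assumes alias Pleroma.Object.Containment:

Containment.contain_origin("https://example.com/objects/1", %{"actor" => "https://example.com/users/alice"})
# => :ok
Containment.contain_origin("https://example.com/objects/1", %{"actor" => "https://other.example/users/mallory"})
# => :error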

View File

@ -22,39 +22,45 @@ defp reinject_object(data) do
# TODO:
# This will create a Create activity, which we need internally at the moment.
def fetch_object_from_id(id) do
def fetch_object_from_id(id, options \\ []) do
if object = Object.get_cached_by_ap_id(id) do
{:ok, object}
else
Logger.info("Fetching #{id} via AP")
with {:ok, data} <- fetch_and_contain_remote_object_from_id(id),
nil <- Object.normalize(data, false),
with {:fetch, {:ok, data}} <- {:fetch, fetch_and_contain_remote_object_from_id(id)},
{:normalize, nil} <- {:normalize, Object.normalize(data, false)},
params <- %{
"type" => "Create",
"to" => data["to"],
"cc" => data["cc"],
# Should we seriously keep this attributedTo thing?
"actor" => data["actor"] || data["attributedTo"],
"object" => data
},
:ok <- Containment.contain_origin(id, params),
{:ok, activity} <- Transmogrifier.handle_incoming(params),
{:containment, :ok} <- {:containment, Containment.contain_origin(id, params)},
{:ok, activity} <- Transmogrifier.handle_incoming(params, options),
{:object, _data, %Object{} = object} <-
{:object, data, Object.normalize(activity, false)} do
{:ok, object}
else
{:containment, _} ->
{:error, "Object containment failed."}
{:error, {:reject, nil}} ->
{:reject, nil}
{:object, data, nil} ->
reinject_object(data)
object = %Object{} ->
{:normalize, object = %Object{}} ->
{:ok, object}
_e ->
# Only fallback when receiving a fetch/normalization error with ActivityPub
Logger.info("Couldn't get object via AP, trying out OStatus fetching...")
# FIXME: OStatus Object Containment?
case OStatus.fetch_activity_from_url(id) do
{:ok, [activity | _]} -> {:ok, Object.normalize(activity, false)}
e -> e
@ -63,8 +69,8 @@ def fetch_object_from_id(id) do
end
end
def fetch_object_from_id!(id) do
with {:ok, object} <- fetch_object_from_id(id) do
def fetch_object_from_id!(id, options \\ []) do
with {:ok, object} <- fetch_object_from_id(id, options) do
object
else
_e ->

View File

@ -61,7 +61,7 @@ defmodule Pleroma.ReverseProxy do
* `http`: options for [hackney](https://github.com/benoitc/hackney).
"""
@default_hackney_options []
@default_hackney_options [pool: :media]
@inline_content_types [
"image/gif",
@ -94,7 +94,8 @@ def call(_conn, _url, _opts \\ [])
def call(conn = %{method: method}, url, opts) when method in @methods do
hackney_opts =
@default_hackney_options
Pleroma.HTTP.Connection.hackney_options([])
|> Keyword.merge(@default_hackney_options)
|> Keyword.merge(Keyword.get(opts, :http, []))
|> HTTP.process_request_options()

View File

@ -836,15 +836,15 @@ def unblock(blocker, %{ap_id: ap_id}) do
def mutes?(nil, _), do: false
def mutes?(user, %{ap_id: ap_id}), do: Enum.member?(user.info.mutes, ap_id)
def blocks?(user, %{ap_id: ap_id}) do
blocks = user.info.blocks
domain_blocks = user.info.domain_blocks
def blocks?(%User{info: info} = _user, %{ap_id: ap_id}) do
blocks = info.blocks
domain_blocks = Pleroma.Web.ActivityPub.MRF.subdomains_regex(info.domain_blocks)
%{host: host} = URI.parse(ap_id)
Enum.member?(blocks, ap_id) ||
Enum.any?(domain_blocks, fn domain ->
host == domain
end)
Pleroma.Web.ActivityPub.MRF.subdomain_match?(domain_blocks, host)
end
def subscribed_to?(user, %{ap_id: ap_id}) do

View File

@ -8,6 +8,7 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
alias Pleroma.Conversation
alias Pleroma.Notification
alias Pleroma.Object
alias Pleroma.Object.Containment
alias Pleroma.Object.Fetcher
alias Pleroma.Pagination
alias Pleroma.Repo
@ -126,6 +127,7 @@ def insert(map, local \\ true, fake \\ false) when is_map(map) do
{:ok, map} <- MRF.filter(map),
{recipients, _, _} = get_recipients(map),
{:fake, false, map, recipients} <- {:fake, fake, map, recipients},
:ok <- Containment.contain_child(map),
{:ok, map, object} <- insert_full_object(map) do
{:ok, activity} =
Repo.insert(%Activity{
@ -272,6 +274,9 @@ def create(%{to: to, actor: actor, context: context, object: object} = params, f
else
{:fake, true, activity} ->
{:ok, activity}
{:error, message} ->
{:error, message}
end
end
@ -728,8 +733,8 @@ defp restrict_state(query, _), do: query
defp restrict_favorited_by(query, %{"favorited_by" => ap_id}) do
from(
activity in query,
where: fragment(~s(? <@ (? #> '{"object","likes"}'\)), ^ap_id, activity.data)
[_activity, object] in query,
where: fragment("(?)->'likes' \\? (?)", object.data, ^ap_id)
)
end
@ -837,7 +842,7 @@ defp restrict_muted_reblogs(query, %{"muting_user" => %User{info: info}}) do
defp restrict_muted_reblogs(query, _), do: query
defp exclude_poll_votes(query, %{"include_poll_votes" => "true"}), do: query
defp exclude_poll_votes(query, %{"include_poll_votes" => true}), do: query
defp exclude_poll_votes(query, _) do
if has_named_binding?(query, :object) do

View File

@ -144,12 +144,43 @@ def followers(conn, %{"nickname" => nickname}) do
end
end
def outbox(conn, %{"nickname" => nickname} = params) do
def outbox(conn, %{"nickname" => nickname, "page" => page?} = params)
when page? in [true, "true"] do
with %User{} = user <- User.get_cached_by_nickname(nickname),
{:ok, user} <- User.ensure_keys_present(user) do
activities =
if params["max_id"] do
ActivityPub.fetch_user_activities(user, nil, %{
"max_id" => params["max_id"],
# This is a hack because postgres generates inefficient queries when filtering by
# 'Answer', poll votes will be hidden by the visibility filter in this case anyway
"include_poll_votes" => true,
"limit" => 10
})
else
ActivityPub.fetch_user_activities(user, nil, %{
"limit" => 10,
"include_poll_votes" => true
})
end
conn
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("activity_collection_page.json", %{
activities: activities,
iri: "#{user.ap_id}/outbox"
})
end
end
def outbox(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname),
{:ok, user} <- User.ensure_keys_present(user) do
conn
|> put_resp_header("content-type", "application/activity+json")
|> json(UserView.render("outbox.json", %{user: user, max_id: params["max_id"]}))
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("activity_collection.json", %{iri: "#{user.ap_id}/outbox"})
end
end
@ -212,18 +243,63 @@ def whoami(%{assigns: %{user: %User{} = user}} = conn, _params) do
def whoami(_conn, _params), do: {:error, :not_found}
def read_inbox(%{assigns: %{user: user}} = conn, %{"nickname" => nickname} = params) do
if nickname == user.nickname do
def read_inbox(
%{assigns: %{user: %{nickname: nickname} = user}} = conn,
%{"nickname" => nickname, "page" => page?} = params
)
when page? in [true, "true"] do
activities =
if params["max_id"] do
ActivityPub.fetch_activities([user.ap_id | user.following], %{
"max_id" => params["max_id"],
"limit" => 10
})
else
ActivityPub.fetch_activities([user.ap_id | user.following], %{"limit" => 10})
end
conn
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("activity_collection_page.json", %{
activities: activities,
iri: "#{user.ap_id}/inbox"
})
end
def read_inbox(%{assigns: %{user: %{nickname: nickname} = user}} = conn, %{
"nickname" => nickname
}) do
with {:ok, user} <- User.ensure_keys_present(user) do
conn
|> put_resp_header("content-type", "application/activity+json")
|> json(UserView.render("inbox.json", %{user: user, max_id: params["max_id"]}))
else
conn
|> put_status(:forbidden)
|> json("can't read inbox of #{nickname} as #{user.nickname}")
|> put_resp_content_type("application/activity+json")
|> put_view(UserView)
|> render("activity_collection.json", %{iri: "#{user.ap_id}/inbox"})
end
end
def read_inbox(%{assigns: %{user: nil}} = conn, %{"nickname" => nickname}) do
err = dgettext("errors", "can't read inbox of %{nickname}", nickname: nickname)
conn
|> put_status(:forbidden)
|> json(err)
end
def read_inbox(%{assigns: %{user: %{nickname: as_nickname}}} = conn, %{
"nickname" => nickname
}) do
err =
dgettext("errors", "can't read inbox of %{nickname} as %{as_nickname}",
nickname: nickname,
as_nickname: as_nickname
)
conn
|> put_status(:forbidden)
|> json(err)
end
def handle_user_activity(user, %{"type" => "Create"} = params) do
object =
params["object"]

View File

@ -25,4 +25,46 @@ def get_policies do
defp get_policies(policy) when is_atom(policy), do: [policy]
defp get_policies(policies) when is_list(policies), do: policies
defp get_policies(_), do: []
@spec subdomains_regex([String.t()]) :: [Regex.t()]
def subdomains_regex(domains) when is_list(domains) do
for domain <- domains, do: ~r(^#{String.replace(domain, "*.", "(.*\\.)*")}$)i
end
@spec subdomain_match?([Regex.t()], String.t()) :: boolean()
def subdomain_match?(domains, host) do
Enum.any?(domains, fn domain -> Regex.match?(domain, host) end)
end
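A quick illustration of the wildcard handling, with made-up domains and assuming alias Pleroma.Web.ActivityPub.MRF:

regexes = MRF.subdomains_regex(["*.badsite.example"])
MRF.subdomain_match?(regexes, "badsite.example")        # => true
MRF.subdomain_match?(regexes, "media.badsite.example")  # => true
MRF.subdomain_match?(regexes, "goodsite.example")       # => false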
@callback describe() :: {:ok | :error, Map.t()}
def describe(policies) do
{:ok, policy_configs} =
policies
|> Enum.reduce({:ok, %{}}, fn
policy, {:ok, data} ->
{:ok, policy_data} = policy.describe()
{:ok, Map.merge(data, policy_data)}
_, error ->
error
end)
mrf_policies =
get_policies()
|> Enum.map(fn policy -> to_string(policy) |> String.split(".") |> List.last() end)
exclusions = Pleroma.Config.get([:instance, :mrf_transparency_exclusions])
base =
%{
mrf_policies: mrf_policies,
exclusions: length(exclusions) > 0
}
|> Map.merge(policy_configs)
{:ok, base}
end
def describe, do: get_policies() |> describe()
end
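The value returned by describe/0 then looks roughly like this (policy list and domains are hypothetical); the nodeinfo controller further below merges it into the federation section:

{:ok,
 %{
   mrf_policies: ["SimplePolicy"],
   exclusions: false,
   mrf_simple: %{reject: ["badsite.example"], media_removal: []}
 }}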

View File

@ -62,4 +62,7 @@ def filter(%{"type" => "Follow", "actor" => actor_id} = message) do
@impl true
def filter(message), do: {:ok, message}
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -5,6 +5,8 @@
defmodule Pleroma.Web.ActivityPub.MRF.AntiLinkSpamPolicy do
alias Pleroma.User
@behaviour Pleroma.Web.ActivityPub.MRF
require Logger
# has the user successfully posted before?
@ -22,6 +24,7 @@ defp contains_links?(%{"content" => content} = _object) do
defp contains_links?(_), do: false
@impl true
def filter(%{"type" => "Create", "actor" => actor, "object" => object} = message) do
with {:ok, %User{} = u} <- User.get_or_fetch_by_ap_id(actor),
{:contains_links, true} <- {:contains_links, contains_links?(object)},
@ -45,4 +48,7 @@ def filter(%{"type" => "Create", "actor" => actor, "object" => object} = message
# in all other cases, pass through
def filter(message), do: {:ok, message}
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -12,4 +12,7 @@ def filter(object) do
Logger.info("REJECTING #{inspect(object)}")
{:reject, object}
end
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -42,4 +42,6 @@ def filter(%{"type" => activity_type} = object) when activity_type == "Create" d
end
def filter(object), do: {:ok, object}
def describe, do: {:ok, %{}}
end

View File

@ -87,4 +87,8 @@ def filter(%{"type" => "Create"} = message) do
@impl true
def filter(message), do: {:ok, message}
@impl true
def describe,
do: {:ok, %{mrf_hellthread: Pleroma.Config.get(:mrf_hellthread) |> Enum.into(%{})}}
end

View File

@ -94,4 +94,36 @@ def filter(%{"type" => "Create", "object" => %{"content" => _content}} = message
@impl true
def filter(message), do: {:ok, message}
@impl true
def describe do
# This horror is needed to convert regex sigils to strings
mrf_keyword =
Pleroma.Config.get(:mrf_keyword, [])
|> Enum.map(fn {key, value} ->
{key,
Enum.map(value, fn
{pattern, replacement} ->
%{
"pattern" =>
if not is_binary(pattern) do
inspect(pattern)
else
pattern
end,
"replacement" => replacement
}
pattern ->
if not is_binary(pattern) do
inspect(pattern)
else
pattern
end
end)}
end)
|> Enum.into(%{})
{:ok, %{mrf_keyword: mrf_keyword}}
end
end
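The regex-to-string conversion above boils down to inspect/1 on the sigil, e.g. (pattern is illustrative):

inspect(~r/badword/i)
# => "~r/badword/i"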

View File

@ -27,4 +27,7 @@ def filter(
@impl true
def filter(object), do: {:ok, object}
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -10,4 +10,7 @@ defmodule Pleroma.Web.ActivityPub.MRF.NoOpPolicy do
def filter(object) do
{:ok, object}
end
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -25,4 +25,6 @@ def filter(%{"type" => activity_type} = object) when activity_type == "Create" d
end
def filter(object), do: {:ok, object}
def describe, do: {:ok, %{}}
end

View File

@ -48,4 +48,8 @@ def filter(%{"type" => "Create"} = object) do
@impl true
def filter(object), do: {:ok, object}
@impl true
def describe,
do: {:ok, %{mrf_rejectnonpublic: Pleroma.Config.get(:mrf_rejectnonpublic) |> Enum.into(%{})}}
end

View File

@ -4,22 +4,29 @@
defmodule Pleroma.Web.ActivityPub.MRF.SimplePolicy do
alias Pleroma.User
alias Pleroma.Web.ActivityPub.MRF
@moduledoc "Filter activities depending on their origin instance"
@behaviour Pleroma.Web.ActivityPub.MRF
@behaviour MRF
defp check_accept(%{host: actor_host} = _actor_info, object) do
accepts = Pleroma.Config.get([:mrf_simple, :accept])
accepts =
Pleroma.Config.get([:mrf_simple, :accept])
|> MRF.subdomains_regex()
cond do
accepts == [] -> {:ok, object}
actor_host == Pleroma.Config.get([Pleroma.Web.Endpoint, :url, :host]) -> {:ok, object}
Enum.member?(accepts, actor_host) -> {:ok, object}
MRF.subdomain_match?(accepts, actor_host) -> {:ok, object}
true -> {:reject, nil}
end
end
defp check_reject(%{host: actor_host} = _actor_info, object) do
if Enum.member?(Pleroma.Config.get([:mrf_simple, :reject]), actor_host) do
rejects =
Pleroma.Config.get([:mrf_simple, :reject])
|> MRF.subdomains_regex()
if MRF.subdomain_match?(rejects, actor_host) do
{:reject, nil}
else
{:ok, object}
@ -31,8 +38,12 @@ defp check_media_removal(
%{"type" => "Create", "object" => %{"attachment" => child_attachment}} = object
)
when length(child_attachment) > 0 do
media_removal =
Pleroma.Config.get([:mrf_simple, :media_removal])
|> MRF.subdomains_regex()
object =
if Enum.member?(Pleroma.Config.get([:mrf_simple, :media_removal]), actor_host) do
if MRF.subdomain_match?(media_removal, actor_host) do
child_object = Map.delete(object["object"], "attachment")
Map.put(object, "object", child_object)
else
@ -51,8 +62,12 @@ defp check_media_nsfw(
"object" => child_object
} = object
) do
media_nsfw =
Pleroma.Config.get([:mrf_simple, :media_nsfw])
|> MRF.subdomains_regex()
object =
if Enum.member?(Pleroma.Config.get([:mrf_simple, :media_nsfw]), actor_host) do
if MRF.subdomain_match?(media_nsfw, actor_host) do
tags = (child_object["tag"] || []) ++ ["nsfw"]
child_object = Map.put(child_object, "tag", tags)
child_object = Map.put(child_object, "sensitive", true)
@ -67,12 +82,12 @@ defp check_media_nsfw(
defp check_media_nsfw(_actor_info, object), do: {:ok, object}
defp check_ftl_removal(%{host: actor_host} = _actor_info, object) do
timeline_removal =
Pleroma.Config.get([:mrf_simple, :federated_timeline_removal])
|> MRF.subdomains_regex()
object =
with true <-
Enum.member?(
Pleroma.Config.get([:mrf_simple, :federated_timeline_removal]),
actor_host
),
with true <- MRF.subdomain_match?(timeline_removal, actor_host),
user <- User.get_cached_by_ap_id(object["actor"]),
true <- "https://www.w3.org/ns/activitystreams#Public" in object["to"] do
to =
@ -94,7 +109,11 @@ defp check_ftl_removal(%{host: actor_host} = _actor_info, object) do
end
defp check_report_removal(%{host: actor_host} = _actor_info, %{"type" => "Flag"} = object) do
if actor_host in Pleroma.Config.get([:mrf_simple, :report_removal]) do
report_removal =
Pleroma.Config.get([:mrf_simple, :report_removal])
|> MRF.subdomains_regex()
if MRF.subdomain_match?(report_removal, actor_host) do
{:reject, nil}
else
{:ok, object}
@ -104,7 +123,11 @@ defp check_report_removal(%{host: actor_host} = _actor_info, %{"type" => "Flag"}
defp check_report_removal(_actor_info, object), do: {:ok, object}
defp check_avatar_removal(%{host: actor_host} = _actor_info, %{"icon" => _icon} = object) do
if actor_host in Pleroma.Config.get([:mrf_simple, :avatar_removal]) do
avatar_removal =
Pleroma.Config.get([:mrf_simple, :avatar_removal])
|> MRF.subdomains_regex()
if MRF.subdomain_match?(avatar_removal, actor_host) do
{:ok, Map.delete(object, "icon")}
else
{:ok, object}
@ -114,7 +137,11 @@ defp check_avatar_removal(%{host: actor_host} = _actor_info, %{"icon" => _icon}
defp check_avatar_removal(_actor_info, object), do: {:ok, object}
defp check_banner_removal(%{host: actor_host} = _actor_info, %{"image" => _image} = object) do
if actor_host in Pleroma.Config.get([:mrf_simple, :banner_removal]) do
banner_removal =
Pleroma.Config.get([:mrf_simple, :banner_removal])
|> MRF.subdomains_regex()
if MRF.subdomain_match?(banner_removal, actor_host) do
{:ok, Map.delete(object, "image")}
else
{:ok, object}
@ -152,4 +179,16 @@ def filter(%{"id" => actor, "type" => obj_type} = object)
end
def filter(object), do: {:ok, object}
@impl true
def describe do
exclusions = Pleroma.Config.get([:instance, :mrf_transparency_exclusions])
mrf_simple =
Pleroma.Config.get(:mrf_simple)
|> Enum.map(fn {k, v} -> {k, Enum.reject(v, fn v -> v in exclusions end)} end)
|> Enum.into(%{})
{:ok, %{mrf_simple: mrf_simple}}
end
end

View File

@ -37,4 +37,7 @@ def filter(%{"actor" => actor} = message) do
@impl true
def filter(message), do: {:ok, message}
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -149,4 +149,7 @@ def filter(%{"actor" => actor, "type" => "Create"} = message),
@impl true
def filter(message), do: {:ok, message}
@impl true
def describe, do: {:ok, %{}}
end

View File

@ -27,4 +27,13 @@ def filter(%{"actor" => actor} = object) do
end
def filter(object), do: {:ok, object}
@impl true
def describe do
mrf_user_allowlist =
Config.get([:mrf_user_allowlist], [])
|> Enum.into(%{}, fn {k, v} -> {k, length(v)} end)
{:ok, %{mrf_user_allowlist: mrf_user_allowlist}}
end
end

View File

@ -0,0 +1,37 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.MRF.VocabularyPolicy do
@moduledoc "Filter messages which belong to certain activity vocabularies"
@behaviour Pleroma.Web.ActivityPub.MRF
def filter(%{"type" => "Undo", "object" => child_message} = message) do
with {:ok, _} <- filter(child_message) do
{:ok, message}
else
{:reject, nil} ->
{:reject, nil}
end
end
def filter(%{"type" => message_type} = message) do
with accepted_vocabulary <- Pleroma.Config.get([:mrf_vocabulary, :accept]),
rejected_vocabulary <- Pleroma.Config.get([:mrf_vocabulary, :reject]),
true <-
length(accepted_vocabulary) == 0 || Enum.member?(accepted_vocabulary, message_type),
false <-
length(rejected_vocabulary) > 0 && Enum.member?(rejected_vocabulary, message_type),
{:ok, _} <- filter(message["object"]) do
{:ok, message}
else
_ -> {:reject, nil}
end
end
def filter(message), do: {:ok, message}
def describe,
do: {:ok, %{mrf_vocabulary: Pleroma.Config.get(:mrf_vocabulary) |> Enum.into(%{})}}
end
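An illustrative configuration for this policy (activity types chosen arbitrarily); with it, filter/1 rejects incoming Announce activities and lets everything else through:

config :pleroma, :mrf_vocabulary,
  accept: [],
  reject: ["Announce"]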

View File

@ -44,7 +44,7 @@ def is_representable?(%Activity{} = activity) do
"""
def publish_one(%{inbox: inbox, json: json, actor: %User{} = actor, id: id} = params) do
Logger.info("Federating #{id} to #{inbox}")
host = URI.parse(inbox).host
%{host: host, path: path} = URI.parse(inbox)
digest = "SHA-256=" <> (:crypto.hash(:sha256, json) |> Base.encode64())
@ -54,6 +54,7 @@ def publish_one(%{inbox: inbox, json: json, actor: %User{} = actor, id: id} = pa
signature =
Pleroma.Signature.sign(actor, %{
"(request-target)": "post #{path}",
host: host,
"content-length": byte_size(json),
digest: digest,
@ -87,8 +88,13 @@ defp should_federate?(inbox, public) do
if public do
true
else
inbox_info = URI.parse(inbox)
!Enum.member?(Config.get([:instance, :quarantined_instances], []), inbox_info.host)
%{host: host} = URI.parse(inbox)
quarantined_instances =
Config.get([:instance, :quarantined_instances], [])
|> Pleroma.Web.ActivityPub.MRF.subdomains_regex()
!Pleroma.Web.ActivityPub.MRF.subdomain_match?(quarantined_instances, host)
end
end

View File

@ -14,6 +14,7 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
alias Pleroma.Web.Federator
import Ecto.Query
@ -22,20 +23,20 @@ defmodule Pleroma.Web.ActivityPub.Transmogrifier do
@doc """
Modifies an incoming AP object (mastodon format) to our internal format.
"""
def fix_object(object) do
def fix_object(object, options \\ []) do
object
|> strip_internal_fields
|> fix_actor
|> fix_url
|> fix_attachments
|> fix_context
|> fix_in_reply_to
|> fix_in_reply_to(options)
|> fix_emoji
|> fix_tag
|> fix_content_map
|> fix_likes
|> fix_addressing
|> fix_summary
|> fix_type
|> fix_type(options)
end
def fix_summary(%{"summary" => nil} = object) do
@ -150,21 +151,9 @@ def fix_actor(%{"attributedTo" => actor} = object) do
|> Map.put("actor", Containment.get_actor(%{"actor" => actor}))
end
# Check for standardisation
# This is what Peertube does
# curl -H 'Accept: application/activity+json' $likes | jq .totalItems
# Prismo returns only an integer (count) as "likes"
def fix_likes(%{"likes" => likes} = object) when not is_map(likes) do
object
|> Map.put("likes", [])
|> Map.put("like_count", 0)
end
def fix_in_reply_to(object, options \\ [])
def fix_likes(object) do
object
end
def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object)
def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
when not is_nil(in_reply_to) do
in_reply_to_id =
cond do
@ -182,28 +171,34 @@ def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object)
""
end
case get_obj_helper(in_reply_to_id) do
{:ok, replied_object} ->
with %Activity{} = _activity <-
Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
object
|> Map.put("inReplyTo", replied_object.data["id"])
|> Map.put("inReplyToAtomUri", object["inReplyToAtomUri"] || in_reply_to_id)
|> Map.put("conversation", replied_object.data["context"] || object["conversation"])
|> Map.put("context", replied_object.data["context"] || object["conversation"])
else
e ->
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
object = Map.put(object, "inReplyToAtomUri", in_reply_to_id)
e ->
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
if Federator.allowed_incoming_reply_depth?(options[:depth]) do
case get_obj_helper(in_reply_to_id, options) do
{:ok, replied_object} ->
with %Activity{} = _activity <-
Activity.get_create_by_object_ap_id(replied_object.data["id"]) do
object
|> Map.put("inReplyTo", replied_object.data["id"])
|> Map.put("inReplyToAtomUri", object["inReplyToAtomUri"] || in_reply_to_id)
|> Map.put("conversation", replied_object.data["context"] || object["conversation"])
|> Map.put("context", replied_object.data["context"] || object["conversation"])
else
e ->
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
e ->
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
else
object
end
end
def fix_in_reply_to(object), do: object
def fix_in_reply_to(object, _options), do: object
def fix_context(object) do
context = object["context"] || object["conversation"] || Utils.generate_context_id()
@ -336,17 +331,24 @@ def fix_content_map(%{"contentMap" => content_map} = object) do
def fix_content_map(object), do: object
def fix_type(%{"inReplyTo" => reply_id} = object) when is_binary(reply_id) do
reply = Object.normalize(reply_id)
def fix_type(object, options \\ [])
if reply && (reply.data["type"] == "Question" and object["name"]) do
def fix_type(%{"inReplyTo" => reply_id, "name" => _} = object, options)
when is_binary(reply_id) do
reply =
with true <- Federator.allowed_incoming_reply_depth?(options[:depth]),
{:ok, object} <- get_obj_helper(reply_id, options) do
object
end
if reply && reply.data["type"] == "Question" do
Map.put(object, "type", "Answer")
else
object
end
end
def fix_type(object), do: object
def fix_type(object, _), do: object
defp mastodon_follow_hack(%{"id" => id, "actor" => follower_id}, followed) do
with true <- id =~ "follows",
@ -374,9 +376,11 @@ defp get_follow_activity(follow_object, followed) do
end
end
def handle_incoming(data, options \\ [])
# Flag objects are placed ahead of the ID check because Mastodon 2.8 and earlier send them
# with nil ID.
def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} = data) do
def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} = data, _options) do
with context <- data["context"] || Utils.generate_context_id(),
content <- data["content"] || "",
%User{} = actor <- User.get_cached_by_ap_id(actor),
@ -409,15 +413,19 @@ def handle_incoming(%{"type" => "Flag", "object" => objects, "actor" => actor} =
end
# disallow objects with bogus IDs
def handle_incoming(%{"id" => nil}), do: :error
def handle_incoming(%{"id" => ""}), do: :error
def handle_incoming(%{"id" => nil}, _options), do: :error
def handle_incoming(%{"id" => ""}, _options), do: :error
# length of https:// = 8, should validate better, but good enough for now.
def handle_incoming(%{"id" => id}) when not (is_binary(id) and length(id) > 8), do: :error
def handle_incoming(%{"id" => id}, _options) when not (is_binary(id) and length(id) > 8),
do: :error
# TODO: validate those with a Ecto scheme
# - tags
# - emoji
def handle_incoming(%{"type" => "Create", "object" => %{"type" => objtype} = object} = data)
def handle_incoming(
%{"type" => "Create", "object" => %{"type" => objtype} = object} = data,
options
)
when objtype in ["Article", "Note", "Video", "Page", "Question", "Answer"] do
actor = Containment.get_actor(data)
@ -427,7 +435,8 @@ def handle_incoming(%{"type" => "Create", "object" => %{"type" => objtype} = obj
with nil <- Activity.get_create_by_object_ap_id(object["id"]),
{:ok, %User{} = user} <- User.get_or_fetch_by_ap_id(data["actor"]) do
object = fix_object(data["object"])
options = Keyword.put(options, :depth, (options[:depth] || 0) + 1)
object = fix_object(data["object"], options)
params = %{
to: data["to"],
@ -452,7 +461,8 @@ def handle_incoming(%{"type" => "Create", "object" => %{"type" => objtype} = obj
end
def handle_incoming(
%{"type" => "Follow", "object" => followed, "actor" => follower, "id" => id} = data
%{"type" => "Follow", "object" => followed, "actor" => follower, "id" => id} = data,
_options
) do
with %User{local: true} = followed <- User.get_cached_by_ap_id(followed),
{:ok, %User{} = follower} <- User.get_or_fetch_by_ap_id(follower),
@ -503,7 +513,8 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Accept", "object" => follow_object, "actor" => _actor, "id" => _id} = data
%{"type" => "Accept", "object" => follow_object, "actor" => _actor, "id" => _id} = data,
_options
) do
with actor <- Containment.get_actor(data),
{:ok, %User{} = followed} <- User.get_or_fetch_by_ap_id(actor),
@ -524,7 +535,8 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Reject", "object" => follow_object, "actor" => _actor, "id" => _id} = data
%{"type" => "Reject", "object" => follow_object, "actor" => _actor, "id" => _id} = data,
_options
) do
with actor <- Containment.get_actor(data),
{:ok, %User{} = followed} <- User.get_or_fetch_by_ap_id(actor),
@ -548,7 +560,8 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Like", "object" => object_id, "actor" => _actor, "id" => id} = data
%{"type" => "Like", "object" => object_id, "actor" => _actor, "id" => id} = data,
_options
) do
with actor <- Containment.get_actor(data),
{:ok, %User{} = actor} <- User.get_or_fetch_by_ap_id(actor),
@ -561,7 +574,8 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Announce", "object" => object_id, "actor" => _actor, "id" => id} = data
%{"type" => "Announce", "object" => object_id, "actor" => _actor, "id" => id} = data,
_options
) do
with actor <- Containment.get_actor(data),
{:ok, %User{} = actor} <- User.get_or_fetch_by_ap_id(actor),
@ -576,7 +590,8 @@ def handle_incoming(
def handle_incoming(
%{"type" => "Update", "object" => %{"type" => object_type} = object, "actor" => actor_id} =
data
data,
_options
)
when object_type in ["Person", "Application", "Service", "Organization"] do
with %User{ap_id: ^actor_id} = actor <- User.get_cached_by_ap_id(object["id"]) do
@ -614,7 +629,8 @@ def handle_incoming(
# an error or a tombstone. This would allow us to verify that a deletion actually took
# place.
def handle_incoming(
%{"type" => "Delete", "object" => object_id, "actor" => _actor, "id" => _id} = data
%{"type" => "Delete", "object" => object_id, "actor" => actor, "id" => _id} = data,
_options
) do
object_id = Utils.get_ap_id(object_id)
@ -625,7 +641,17 @@ def handle_incoming(
{:ok, activity} <- ActivityPub.delete(object, false) do
{:ok, activity}
else
_e -> :error
nil ->
case User.get_cached_by_ap_id(object_id) do
%User{ap_id: ^actor} = user ->
User.delete(user)
nil ->
:error
end
_e ->
:error
end
end
@ -635,7 +661,8 @@ def handle_incoming(
"object" => %{"type" => "Announce", "object" => object_id},
"actor" => _actor,
"id" => id
} = data
} = data,
_options
) do
with actor <- Containment.get_actor(data),
{:ok, %User{} = actor} <- User.get_or_fetch_by_ap_id(actor),
@ -653,7 +680,8 @@ def handle_incoming(
"object" => %{"type" => "Follow", "object" => followed},
"actor" => follower,
"id" => id
} = _data
} = _data,
_options
) do
with %User{local: true} = followed <- User.get_cached_by_ap_id(followed),
{:ok, %User{} = follower} <- User.get_or_fetch_by_ap_id(follower),
@ -671,10 +699,10 @@ def handle_incoming(
"object" => %{"type" => "Block", "object" => blocked},
"actor" => blocker,
"id" => id
} = _data
} = _data,
_options
) do
with true <- Pleroma.Config.get([:activitypub, :accept_blocks]),
%User{local: true} = blocked <- User.get_cached_by_ap_id(blocked),
with %User{local: true} = blocked <- User.get_cached_by_ap_id(blocked),
{:ok, %User{} = blocker} <- User.get_or_fetch_by_ap_id(blocker),
{:ok, activity} <- ActivityPub.unblock(blocker, blocked, id, false) do
User.unblock(blocker, blocked)
@ -685,10 +713,10 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Block", "object" => blocked, "actor" => blocker, "id" => id} = _data
%{"type" => "Block", "object" => blocked, "actor" => blocker, "id" => id} = _data,
_options
) do
with true <- Pleroma.Config.get([:activitypub, :accept_blocks]),
%User{local: true} = blocked = User.get_cached_by_ap_id(blocked),
with %User{local: true} = blocked = User.get_cached_by_ap_id(blocked),
{:ok, %User{} = blocker} = User.get_or_fetch_by_ap_id(blocker),
{:ok, activity} <- ActivityPub.block(blocker, blocked, id, false) do
User.unfollow(blocker, blocked)
@ -705,7 +733,8 @@ def handle_incoming(
"object" => %{"type" => "Like", "object" => object_id},
"actor" => _actor,
"id" => id
} = data
} = data,
_options
) do
with actor <- Containment.get_actor(data),
{:ok, %User{} = actor} <- User.get_or_fetch_by_ap_id(actor),
@ -717,10 +746,10 @@ def handle_incoming(
end
end
def handle_incoming(_), do: :error
def handle_incoming(_, _), do: :error
def get_obj_helper(id) do
if object = Object.normalize(id), do: {:ok, object}, else: nil
def get_obj_helper(id, options \\ []) do
if object = Object.normalize(id, true, options), do: {:ok, object}, else: nil
end
def set_reply_to_uri(%{"inReplyTo" => in_reply_to} = object) when is_binary(in_reply_to) do
@ -742,7 +771,6 @@ def prepare_object(object) do
|> add_mention_tags
|> add_emoji_tags
|> add_attributed_to
|> add_likes
|> prepare_attachments
|> set_conversation
|> set_reply_to_uri
@ -926,22 +954,6 @@ def add_attributed_to(object) do
|> Map.put("attributedTo", attributed_to)
end
def add_likes(%{"id" => id, "like_count" => likes} = object) do
likes = %{
"id" => "#{id}/likes",
"first" => "#{id}/likes?page=1",
"type" => "OrderedCollection",
"totalItems" => likes
}
object
|> Map.put("likes", likes)
end
def add_likes(object) do
object
end
def prepare_attachments(object) do
attachments =
(object["attachment"] || [])
@ -957,6 +969,7 @@ def prepare_attachments(object) do
defp strip_internal_fields(object) do
object
|> Map.drop([
"likes",
"like_count",
"announcements",
"announcement_count",

View File

@ -251,20 +251,6 @@ def insert_full_object(%{"object" => %{"type" => type} = object_data} = map)
def insert_full_object(map), do: {:ok, map, nil}
def update_object_in_activities(%{data: %{"id" => id}} = object) do
# TODO
# Update activities that already had this. Could be done in a seperate process.
# Alternatively, just don't do this and fetch the current object each time. Most
# could probably be taken from cache.
relevant_activities = Activity.get_all_create_by_object_ap_id(id)
Enum.map(relevant_activities, fn activity ->
new_activity_data = activity.data |> Map.put("object", object.data)
changeset = Changeset.change(activity, data: new_activity_data)
Repo.update(changeset)
end)
end
#### Like-related helpers
@doc """
@ -347,8 +333,7 @@ def update_element_in_object(property, element, object) do
|> Map.put("#{property}_count", length(element))
|> Map.put("#{property}s", element),
changeset <- Changeset.change(object, data: new_data),
{:ok, object} <- Object.update_and_set_cache(changeset),
_ <- update_object_in_activities(object) do
{:ok, object} <- Object.update_and_set_cache(changeset) do
{:ok, object}
end
end

View File

@ -8,7 +8,6 @@ defmodule Pleroma.Web.ActivityPub.UserView do
alias Pleroma.Keys
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.Endpoint
@ -173,25 +172,22 @@ def render("followers.json", %{user: user}) do
|> Map.merge(Utils.make_json_ld_header())
end
def render("outbox.json", %{user: user, max_id: max_qid}) do
params = %{
"limit" => "10"
def render("activity_collection.json", %{iri: iri}) do
%{
"id" => iri,
"type" => "OrderedCollection",
"first" => "#{iri}?page=true"
}
|> Map.merge(Utils.make_json_ld_header())
end
params =
if max_qid != nil do
Map.put(params, "max_id", max_qid)
else
params
end
activities = ActivityPub.fetch_user_activities(user, nil, params)
def render("activity_collection_page.json", %{activities: activities, iri: iri}) do
# this is sorted chronologically, so first activity is the newest (max)
{max_id, min_id, collection} =
if length(activities) > 0 do
{
Enum.at(Enum.reverse(activities), 0).id,
Enum.at(activities, 0).id,
Enum.at(Enum.reverse(activities), 0).id,
Enum.map(activities, fn act ->
{:ok, data} = Transmogrifier.prepare_outgoing(act.data)
data
@ -205,71 +201,14 @@ def render("outbox.json", %{user: user, max_id: max_qid}) do
}
end
iri = "#{user.ap_id}/outbox"
page = %{
"id" => "#{iri}?max_id=#{max_id}",
%{
"id" => "#{iri}?max_id=#{max_id}&page=true",
"type" => "OrderedCollectionPage",
"partOf" => iri,
"orderedItems" => collection,
"next" => "#{iri}?max_id=#{min_id}"
"next" => "#{iri}?max_id=#{min_id}&page=true"
}
if max_qid == nil do
%{
"id" => iri,
"type" => "OrderedCollection",
"first" => page
}
|> Map.merge(Utils.make_json_ld_header())
else
page |> Map.merge(Utils.make_json_ld_header())
end
end
def render("inbox.json", %{user: user, max_id: max_qid}) do
params = %{
"limit" => "10"
}
params =
if max_qid != nil do
Map.put(params, "max_id", max_qid)
else
params
end
activities = ActivityPub.fetch_activities([user.ap_id | user.following], params)
min_id = Enum.at(Enum.reverse(activities), 0).id
max_id = Enum.at(activities, 0).id
collection =
Enum.map(activities, fn act ->
{:ok, data} = Transmogrifier.prepare_outgoing(act.data)
data
end)
iri = "#{user.ap_id}/inbox"
page = %{
"id" => "#{iri}?max_id=#{max_id}",
"type" => "OrderedCollectionPage",
"partOf" => iri,
"orderedItems" => collection,
"next" => "#{iri}?max_id=#{min_id}"
}
if max_qid == nil do
%{
"id" => iri,
"type" => "OrderedCollection",
"first" => page
}
|> Map.merge(Utils.make_json_ld_header())
else
page |> Map.merge(Utils.make_json_ld_header())
end
|> Map.merge(Utils.make_json_ld_header())
end
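Illustrative output of the two views above, with made-up IRIs and ids, and omitting the JSON-LD header fields that get merged in:

# "activity_collection.json"
%{
  "id" => "https://example.com/users/alice/outbox",
  "type" => "OrderedCollection",
  "first" => "https://example.com/users/alice/outbox?page=true"
}

# "activity_collection_page.json"
%{
  "id" => "https://example.com/users/alice/outbox?max_id=20&page=true",
  "type" => "OrderedCollectionPage",
  "partOf" => "https://example.com/users/alice/outbox",
  "orderedItems" => [],
  "next" => "https://example.com/users/alice/outbox?max_id=11&page=true"
}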
def collection(collection, iri, page, show_items \\ true, total \\ nil) do

View File

@ -11,6 +11,7 @@ defmodule Pleroma.Web.CommonAPI do
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.ActivityPub.Visibility
import Pleroma.Web.CommonAPI.Utils
@ -284,12 +285,11 @@ def pin(id_or_ap_id, %{ap_id: user_ap_id} = user) do
},
object: %Object{
data: %{
"to" => object_to,
"type" => "Note"
}
}
} = activity <- get_by_id_or_ap_id(id_or_ap_id),
true <- Enum.member?(object_to, "https://www.w3.org/ns/activitystreams#Public"),
true <- Visibility.is_public?(activity),
%{valid?: true} = info_changeset <-
User.Info.add_pinnned_activity(user.info, activity),
changeset <-

View File

@ -22,6 +22,18 @@ def init do
refresh_subscriptions()
end
@doc "Addresses [memory leaks on recursive replies fetching](https://git.pleroma.social/pleroma/pleroma/issues/161)"
# credo:disable-for-previous-line Credo.Check.Readability.MaxLineLength
def allowed_incoming_reply_depth?(depth) do
max_replies_depth = Pleroma.Config.get([:instance, :federation_incoming_replies_max_depth])
if max_replies_depth do
(depth || 1) <= max_replies_depth
else
true
end
end
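A sketch of the cap in action, assuming the hypothetical setting config :pleroma, :instance, federation_incoming_replies_max_depth: 100 (with the option unset the function always returns true):

allowed_incoming_reply_depth?(nil)  # => true, a missing depth counts as 1
allowed_incoming_reply_depth?(50)   # => true
allowed_incoming_reply_depth?(101)  # => false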
# Client API
def incoming_doc(doc) do

View File

@ -28,7 +28,7 @@ def render("mention.json", %{user: user}) do
id: to_string(user.id),
acct: user.nickname,
username: username_from_nickname(user.nickname),
url: user.ap_id
url: User.profile_url(user)
}
end
@ -71,6 +71,13 @@ defp do_render("account.json", %{user: user} = opts) do
image = User.avatar_url(user) |> MediaProxy.url()
header = User.banner_url(user) |> MediaProxy.url()
user_info = User.get_cached_user_info(user)
following_count =
((!user.info.hide_follows or opts[:for] == user) && user_info.following_count) || 0
followers_count =
((!user.info.hide_followers or opts[:for] == user) && user_info.follower_count) || 0
bot = (user.info.source_data["type"] || "Person") in ["Application", "Service"]
emojis =
@ -101,11 +108,11 @@ defp do_render("account.json", %{user: user} = opts) do
display_name: display_name,
locked: user_info.locked,
created_at: Utils.to_masto_date(user.inserted_at),
followers_count: user_info.follower_count,
following_count: user_info.following_count,
followers_count: followers_count,
following_count: following_count,
statuses_count: user_info.note_count,
note: bio || "",
url: user.ap_id,
url: User.profile_url(user),
avatar: image,
avatar_static: image,
header: header,

View File

@ -160,7 +160,7 @@ def render("status.json", %{activity: %{data: %{"object" => _object}} = activity
thread_muted? =
case activity.thread_muted? do
thread_muted? when is_boolean(thread_muted?) -> thread_muted?
nil -> CommonAPI.thread_muted?(user, activity)
nil -> (opts[:for] && CommonAPI.thread_muted?(opts[:for], activity)) || false
end
attachment_data = object.data["attachment"] || []
@ -374,7 +374,7 @@ def render("poll.json", %{object: object} = opts) do
%{
# Mastodon uses separate ids for polls, but an object can't have
# more than one poll embedded so object id is fine
id: object.id,
id: to_string(object.id),
expires_at: Utils.to_masto_date(end_time),
expired: expired,
multiple: multiple,

View File

@ -28,17 +28,17 @@ def remote(conn, %{"sig" => sig64, "url" => url64} = params) do
end
def filename_matches(has_filename, path, url) do
filename =
url
|> MediaProxy.filename()
|> URI.decode()
filename = url |> MediaProxy.filename()
path = URI.decode(path)
if has_filename && filename && Path.basename(path) != filename do
if has_filename && filename && does_not_match(path, filename) do
{:wrong_filename, filename}
else
:ok
end
end
defp does_not_match(path, filename) do
basename = Path.basename(path)
basename != filename and URI.decode(basename) != filename and URI.encode(basename) != filename
end
end
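The core of the relaxed check is the new does_not_match/2: a basename is accepted when it equals the expected filename either as-is, URI-decoded, or URI-encoded. With illustrative values:

basename = "an%20image.png"
filename = "an image.png"
basename != filename and
  URI.decode(basename) != filename and
  URI.encode(basename) != filename
# => false, i.e. the filename is treated as matching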

View File

@ -121,4 +121,6 @@ defp build_attachments(%{data: %{"attachment" => attachments}}) do
acc ++ rendered_tags
end)
end
defp build_attachments(_), do: []
end

View File

@ -117,6 +117,8 @@ defp build_attachments(id, %{data: %{"attachment" => attachments}}) do
end)
end
defp build_attachments(_id, _object), do: []
defp player_url(id) do
Pleroma.Web.Router.Helpers.o_status_url(Pleroma.Web.Endpoint, :notice_player, id)
end

View File

@ -34,60 +34,18 @@ def schemas(conn, _params) do
def raw_nodeinfo do
stats = Stats.get_stats()
mrf_simple =
Config.get(:mrf_simple)
|> Enum.into(%{})
# This horror is needed to convert regex sigils to strings
mrf_keyword =
Config.get(:mrf_keyword, [])
|> Enum.map(fn {key, value} ->
{key,
Enum.map(value, fn
{pattern, replacement} ->
%{
"pattern" =>
if not is_binary(pattern) do
inspect(pattern)
else
pattern
end,
"replacement" => replacement
}
pattern ->
if not is_binary(pattern) do
inspect(pattern)
else
pattern
end
end)}
end)
|> Enum.into(%{})
mrf_policies =
MRF.get_policies()
|> Enum.map(fn policy -> to_string(policy) |> String.split(".") |> List.last() end)
quarantined = Config.get([:instance, :quarantined_instances], [])
staff_accounts =
User.all_superusers()
|> Enum.map(fn u -> u.ap_id end)
mrf_user_allowlist =
Config.get([:mrf_user_allowlist], [])
|> Enum.into(%{}, fn {k, v} -> {k, length(v)} end)
federation_response =
if Config.get([:instance, :mrf_transparency]) do
%{
mrf_policies: mrf_policies,
mrf_simple: mrf_simple,
mrf_keyword: mrf_keyword,
mrf_user_allowlist: mrf_user_allowlist,
quarantined_instances: quarantined
}
{:ok, data} = MRF.describe()
data
|> Map.merge(%{quarantined_instances: quarantined})
else
%{}
end

View File

@ -182,6 +182,7 @@ def to_simple_form(%{data: %{"type" => "Announce"}} = activity, user, with_autho
author = if with_author, do: [{:author, UserRepresenter.to_simple_form(user)}], else: []
retweeted_activity = Activity.get_create_by_object_ap_id(activity.data["object"])
retweeted_object = Object.normalize(retweeted_activity)
retweeted_user = User.get_cached_by_ap_id(retweeted_activity.data["actor"])
retweeted_xml = to_simple_form(retweeted_activity, retweeted_user, true)
@ -196,7 +197,7 @@ def to_simple_form(%{data: %{"type" => "Announce"}} = activity, user, with_autho
{:"activity:verb", ['http://activitystrea.ms/schema/1.0/share']},
{:id, h.(activity.data["id"])},
{:title, ['#{user.nickname} repeated a notice']},
{:content, [type: 'html'], ['RT #{retweeted_activity.data["object"]["content"]}']},
{:content, [type: 'html'], ['RT #{retweeted_object.data["content"]}']},
{:published, h.(inserted_at)},
{:updated, h.(updated_at)},
{:"ostatus:conversation", [ref: h.(activity.data["context"])],

View File

@ -9,14 +9,18 @@ defmodule Pleroma.Web.OStatus.FollowHandler do
alias Pleroma.Web.XML
def handle(entry, doc) do
with {:ok, actor} <- OStatus.find_make_or_update_user(doc),
with {:ok, actor} <- OStatus.find_make_or_update_actor(doc),
id when not is_nil(id) <- XML.string_from_xpath("/entry/id", entry),
followed_uri when not is_nil(followed_uri) <-
XML.string_from_xpath("/entry/activity:object/id", entry),
{:ok, followed} <- OStatus.find_or_make_user(followed_uri),
{:locked, false} <- {:locked, followed.info.locked},
{:ok, activity} <- ActivityPub.follow(actor, followed, id, false) do
User.follow(actor, followed)
{:ok, activity}
else
{:locked, true} ->
{:error, "It's not possible to follow locked accounts over OStatus"}
end
end
end

View File

@ -10,6 +10,7 @@ defmodule Pleroma.Web.OStatus.NoteHandler do
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.Federator
alias Pleroma.Web.OStatus
alias Pleroma.Web.XML
@ -88,14 +89,15 @@ def add_external_url(note, entry) do
Map.put(note, "external_url", url)
end
def fetch_replied_to_activity(entry, in_reply_to) do
def fetch_replied_to_activity(entry, in_reply_to, options \\ []) do
with %Activity{} = activity <- Activity.get_create_by_object_ap_id(in_reply_to) do
activity
else
_e ->
with in_reply_to_href when not is_nil(in_reply_to_href) <-
with true <- Federator.allowed_incoming_reply_depth?(options[:depth]),
in_reply_to_href when not is_nil(in_reply_to_href) <-
XML.string_from_xpath("//thr:in-reply-to[1]/@href", entry),
{:ok, [activity | _]} <- OStatus.fetch_activity_from_url(in_reply_to_href) do
{:ok, [activity | _]} <- OStatus.fetch_activity_from_url(in_reply_to_href, options) do
activity
else
_e -> nil
@ -104,15 +106,16 @@ def fetch_replied_to_activity(entry, in_reply_to) do
end
# TODO: Clean this up a bit.
def handle_note(entry, doc \\ nil) do
def handle_note(entry, doc \\ nil, options \\ []) do
with id <- XML.string_from_xpath("//id", entry),
activity when is_nil(activity) <- Activity.get_create_by_object_ap_id_with_object(id),
[author] <- :xmerl_xpath.string('//author[1]', doc),
{:ok, actor} <- OStatus.find_make_or_update_user(author),
{:ok, actor} <- OStatus.find_make_or_update_actor(author),
content_html <- OStatus.get_content(entry),
cw <- OStatus.get_cw(entry),
in_reply_to <- XML.string_from_xpath("//thr:in-reply-to[1]/@ref", entry),
in_reply_to_activity <- fetch_replied_to_activity(entry, in_reply_to),
options <- Keyword.put(options, :depth, (options[:depth] || 0) + 1),
in_reply_to_activity <- fetch_replied_to_activity(entry, in_reply_to, options),
in_reply_to_object <-
(in_reply_to_activity && Object.normalize(in_reply_to_activity)) || nil,
in_reply_to <- (in_reply_to_object && in_reply_to_object.data["id"]) || in_reply_to,

View File

@ -9,7 +9,7 @@ defmodule Pleroma.Web.OStatus.UnfollowHandler do
alias Pleroma.Web.XML
def handle(entry, doc) do
with {:ok, actor} <- OStatus.find_make_or_update_user(doc),
with {:ok, actor} <- OStatus.find_make_or_update_actor(doc),
id when not is_nil(id) <- XML.string_from_xpath("/entry/id", entry),
followed_uri when not is_nil(followed_uri) <-
XML.string_from_xpath("/entry/activity:object/id", entry),

View File

@ -54,9 +54,9 @@ def remote_follow_path do
"#{Web.base_url()}/ostatus_subscribe?acct={uri}"
end
def handle_incoming(xml_string) do
def handle_incoming(xml_string, options \\ []) do
with doc when doc != :error <- parse_document(xml_string) do
with {:ok, actor_user} <- find_make_or_update_user(doc),
with {:ok, actor_user} <- find_make_or_update_actor(doc),
do: Pleroma.Instances.set_reachable(actor_user.ap_id)
entries = :xmerl_xpath.string('//entry', doc)
@ -91,10 +91,12 @@ def handle_incoming(xml_string) do
_ ->
case object_type do
'http://activitystrea.ms/schema/1.0/note' ->
with {:ok, activity} <- NoteHandler.handle_note(entry, doc), do: activity
with {:ok, activity} <- NoteHandler.handle_note(entry, doc, options),
do: activity
'http://activitystrea.ms/schema/1.0/comment' ->
with {:ok, activity} <- NoteHandler.handle_note(entry, doc), do: activity
with {:ok, activity} <- NoteHandler.handle_note(entry, doc, options),
do: activity
_ ->
Logger.error("Couldn't parse incoming document")
@ -118,7 +120,7 @@ def handle_incoming(xml_string) do
end
def make_share(entry, doc, retweeted_activity) do
with {:ok, actor} <- find_make_or_update_user(doc),
with {:ok, actor} <- find_make_or_update_actor(doc),
%Object{} = object <- Object.normalize(retweeted_activity),
id when not is_nil(id) <- string_from_xpath("/entry/id", entry),
{:ok, activity, _object} = ActivityPub.announce(actor, object, id, false) do
@ -136,7 +138,7 @@ def handle_share(entry, doc) do
end
def make_favorite(entry, doc, favorited_activity) do
with {:ok, actor} <- find_make_or_update_user(doc),
with {:ok, actor} <- find_make_or_update_actor(doc),
%Object{} = object <- Object.normalize(favorited_activity),
id when not is_nil(id) <- string_from_xpath("/entry/id", entry),
{:ok, activity, _object} = ActivityPub.like(actor, object, id, false) do
@ -262,11 +264,18 @@ def maybe_update_ostatus(doc, user) do
end
end
def find_make_or_update_user(doc) do
def find_make_or_update_actor(doc) do
uri = string_from_xpath("//author/uri[1]", doc)
with {:ok, user} <- find_or_make_user(uri) do
with {:ok, %User{} = user} <- find_or_make_user(uri),
{:ap_enabled, false} <- {:ap_enabled, User.ap_enabled?(user)} do
maybe_update(doc, user)
else
{:ap_enabled, true} ->
{:error, :invalid_protocol}
_ ->
{:error, :unknown_user}
end
end
@ -359,7 +368,7 @@ def get_atom_url(body) do
end
end
def fetch_activity_from_atom_url(url) do
def fetch_activity_from_atom_url(url, options \\ []) do
with true <- String.starts_with?(url, "http"),
{:ok, %{body: body, status: code}} when code in 200..299 <-
HTTP.get(
@ -367,7 +376,7 @@ def fetch_activity_from_atom_url(url) do
[{:Accept, "application/atom+xml"}]
) do
Logger.debug("Got document from #{url}, handling...")
handle_incoming(body)
handle_incoming(body, options)
else
e ->
Logger.debug("Couldn't get #{url}: #{inspect(e)}")
@ -375,13 +384,13 @@ def fetch_activity_from_atom_url(url) do
end
end
def fetch_activity_from_html_url(url) do
def fetch_activity_from_html_url(url, options \\ []) do
Logger.debug("Trying to fetch #{url}")
with true <- String.starts_with?(url, "http"),
{:ok, %{body: body}} <- HTTP.get(url, []),
{:ok, atom_url} <- get_atom_url(body) do
fetch_activity_from_atom_url(atom_url)
fetch_activity_from_atom_url(atom_url, options)
else
e ->
Logger.debug("Couldn't get #{url}: #{inspect(e)}")
@ -389,11 +398,11 @@ def fetch_activity_from_html_url(url) do
end
end
def fetch_activity_from_url(url) do
with {:ok, [_ | _] = activities} <- fetch_activity_from_atom_url(url) do
def fetch_activity_from_url(url, options \\ []) do
with {:ok, [_ | _] = activities} <- fetch_activity_from_atom_url(url, options) do
{:ok, activities}
else
_e -> fetch_activity_from_html_url(url)
_e -> fetch_activity_from_html_url(url, options)
end
rescue
e ->

View File

@ -684,7 +684,7 @@ defmodule Pleroma.Web.Router do
delete("/auth/sign_out", MastodonAPIController, :logout)
scope [] do
pipe_through(:oauth_read_or_public)
pipe_through(:oauth_read)
get("/web/*path", MastodonAPIController, :index)
end
end
@ -724,6 +724,7 @@ defmodule Pleroma.Web.Router do
defmodule Fallback.RedirectController do
use Pleroma.Web, :controller
require Logger
alias Pleroma.User
alias Pleroma.Web.Metadata
@ -750,7 +751,20 @@ def redirector_with_meta(conn, %{"maybe_nickname_or_id" => maybe_nickname_or_id}
def redirector_with_meta(conn, params) do
{:ok, index_content} = File.read(index_file_path())
tags = Metadata.build_tags(params)
tags =
try do
Metadata.build_tags(params)
rescue
e ->
Logger.error(
"Metadata rendering for #{conn.request_path} failed.\n" <>
Exception.format(:error, e, __STACKTRACE__)
)
""
end
response = String.replace(index_content, "<!--server-generated-meta-->", tags)
conn

View File

@ -36,6 +36,11 @@
margin-bottom: 20px;
}
a {
color: #d8a070;
text-decoration: none;
}
form {
width: 100%;
}

View File

@ -9,11 +9,12 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
alias Comeonin.Pbkdf2
alias Pleroma.Activity
alias Pleroma.Config
alias Pleroma.Emoji
alias Pleroma.Healthcheck
alias Pleroma.Notification
alias Pleroma.User
alias Pleroma.Web
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.OStatus
alias Pleroma.Web.WebFinger
@ -23,7 +24,8 @@ def help_test(conn, _params) do
end
def remote_subscribe(conn, %{"nickname" => nick, "profile" => _}) do
with %User{} = user <- User.get_cached_by_nickname(nick), avatar = User.avatar_url(user) do
with %User{} = user <- User.get_cached_by_nickname(nick),
avatar = User.avatar_url(user) do
conn
|> render("subscribe.html", %{nickname: nick, avatar: avatar, error: false})
else
@ -98,8 +100,7 @@ def do_remote_follow(conn, %{
with %User{} = user <- User.get_cached_by_nickname(username),
true <- Pbkdf2.checkpw(password, user.password_hash),
%User{} = _followed <- User.get_cached_by_id(id),
{:ok, follower} <- User.follow(user, followee),
{:ok, _activity} <- ActivityPub.follow(follower, followee) do
{:ok, _follower, _followee, _activity} <- CommonAPI.follow(user, followee) do
conn
|> render("followed.html", %{error: false})
else
@ -120,8 +121,7 @@ def do_remote_follow(conn, %{
def do_remote_follow(%{assigns: %{user: user}} = conn, %{"user" => %{"id" => id}}) do
with %User{} = followee <- User.get_cached_by_id(id),
{:ok, follower} <- User.follow(user, followee),
{:ok, _activity} <- ActivityPub.follow(follower, followee) do
{:ok, _follower, _followee, _activity} <- CommonAPI.follow(user, followee) do
conn
|> render("followed.html", %{error: false})
else
@ -338,20 +338,21 @@ def captcha(conn, _params) do
end
def healthcheck(conn, _params) do
info =
if Pleroma.Config.get([:instance, :healthcheck]) do
Pleroma.Healthcheck.system_info()
else
%{}
end
with true <- Config.get([:instance, :healthcheck]),
%{healthy: true} = info <- Healthcheck.system_info() do
json(conn, info)
else
%{healthy: false} = info ->
service_unavailable(conn, info)
conn =
if info[:healthy] do
conn
else
Plug.Conn.put_status(conn, :service_unavailable)
end
_ ->
service_unavailable(conn, %{})
end
end
json(conn, info)
defp service_unavailable(conn, info) do
conn
|> put_status(:service_unavailable)
|> json(info)
end
end

mix.exs (78 changed lines)
View File

@ -4,7 +4,7 @@ defmodule Pleroma.Mixfile do
def project do
[
app: :pleroma,
version: version("1.0.0"),
version: version("1.0.7"),
elixir: "~> 1.7",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: [:phoenix, :gettext] ++ Mix.compilers(),
@ -95,6 +95,7 @@ defp oauth_deps do
defp deps do
[
{:phoenix, "~> 1.4.8"},
{:tzdata, "~> 0.5.21"},
{:plug_cowboy, "~> 2.0"},
{:phoenix_pubsub, "~> 1.1"},
{:phoenix_ecto, "~> 4.0"},
@ -174,22 +175,27 @@ defp aliases do
# Builds a version string made of:
# * the application version
# * a pre-release if ahead of the tag: the describe string (-count-commithash)
# * build info:
# * branch name
# * build metadata:
# * a build name if `PLEROMA_BUILD_NAME` or `:pleroma, :build_name` is defined
# * the mix environment if different than prod
defp version(version) do
identifier_filter = ~r/[^0-9a-z\-]+/i
# Pre-release version, denoted from patch version with a hyphen
{git_tag, git_pre_release} =
with {tag, 0} <-
System.cmd("git", ["describe", "--tags", "--abbrev=0"], stderr_to_stdout: true),
tag = String.trim(tag),
{describe, 0} <- System.cmd("git", ["describe", "--tags", "--abbrev=8"]),
describe = String.trim(describe),
ahead <- String.replace(describe, tag, "") do
ahead <- String.replace(describe, tag, ""),
ahead <- String.trim_leading(ahead, "-") do
{String.replace_prefix(tag, "v", ""), if(ahead != "", do: String.trim(ahead))}
else
_ ->
{commit_hash, 0} = System.cmd("git", ["rev-parse", "--short", "HEAD"])
{nil, "-0-g" <> String.trim(commit_hash)}
{nil, "0-g" <> String.trim(commit_hash)}
end
if git_tag && version != git_tag do
@ -198,6 +204,23 @@ defp version(version) do
)
end
# Branch name as pre-release version component, denoted with a dot
branch_name =
with {branch_name, 0} <- System.cmd("git", ["rev-parse", "--abbrev-ref", "HEAD"]),
branch_name <- String.trim(branch_name),
branch_name <- System.get_env("PLEROMA_BUILD_BRANCH") || branch_name,
true <-
!Enum.all?(["master", "HEAD", "release/", "stable"], fn name ->
String.starts_with?(name, branch_name)
end) do
branch_name =
branch_name
|> String.trim()
|> String.replace(identifier_filter, "-")
branch_name
end
build_name =
cond do
name = Application.get_env(:pleroma, :build_name) -> name
@ -206,28 +229,37 @@ defp version(version) do
end
env_name = if Mix.env() != :prod, do: to_string(Mix.env())
env_override = System.get_env("PLEROMA_BUILD_ENV")
build =
[build_name, env_name]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join("-")
|> (fn
"" -> nil
string -> "+" <> string
end).()
branch_name =
with {branch_name, 0} <- System.cmd("git", ["rev-parse", "--abbrev-ref", "HEAD"]),
branch_name <- System.get_env("PLEROMA_BUILD_BRANCH") || branch_name,
true <- branch_name != "master" do
branch_name =
String.trim(branch_name)
|> String.replace(~r/[^0-9a-z\-\.]+/i, "-")
"-" <> branch_name
env_name =
case env_override do
nil -> env_name
env_override when env_override in ["", "prod"] -> nil
env_override -> env_override
end
[version, git_pre_release, branch_name, build]
# Pre-release version, denoted by appending a hyphen
# and a series of dot separated identifiers
pre_release =
[git_pre_release, branch_name]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join(".")
|> (fn
"" -> nil
string -> "-" <> String.replace(string, identifier_filter, "-")
end).()
# Build metadata, denoted with a plus sign
build_metadata =
[build_name, env_name]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join(".")
|> (fn
"" -> nil
string -> "+" <> String.replace(string, identifier_filter, "-")
end).()
[version, pre_release, build_metadata]
|> Enum.filter(fn string -> string && string != "" end)
|> Enum.join()
end
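
To make the version format described in the comments above concrete, a minimal sketch with invented component values (only the joining rules come from version/1 above; the exact git output is hypothetical):

# Hypothetical build: 14 commits ahead of the v1.0.7 tag, Mix env :dev,
# no build name, and no branch component included for this example.
version = "1.0.7"
git_pre_release = "14-g02a7b8b3"   # `git describe` suffix with the leading "-" trimmed
env_name = "dev"                   # Mix env other than :prod

# Pre-release identifiers are joined with "." and prefixed with "-";
# on branch builds the sanitized branch name is joined in as a further identifier.
pre_release = "-" <> git_pre_release

# Build metadata is joined the same way and prefixed with "+".
build_metadata = "+" <> env_name

IO.puts(version <> pre_release <> build_metadata)
# => 1.0.7-14-g02a7b8b3+dev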

View File

@ -6,7 +6,7 @@
"benchee": {:hex, :benchee, "1.0.1", "66b211f9bfd84bd97e6d1beaddf8fc2312aaabe192f776e8931cb0c16f53a521", [:mix], [{:deep_merge, "~> 1.0", [hex: :deep_merge, repo: "hexpm", optional: false]}], "hexpm"},
"bunt": {:hex, :bunt, "0.2.0", "951c6e801e8b1d2cbe58ebbd3e616a869061ddadcc4863d0a2182541acae9a38", [:mix], [], "hexpm"},
"cachex": {:hex, :cachex, "3.0.2", "1351caa4e26e29f7d7ec1d29b53d6013f0447630bbf382b4fb5d5bad0209f203", [:mix], [{:eternal, "~> 1.2", [hex: :eternal, repo: "hexpm", optional: false]}, {:unsafe, "~> 1.0", [hex: :unsafe, repo: "hexpm", optional: false]}], "hexpm"},
"calendar": {:hex, :calendar, "0.17.4", "22c5e8d98a4db9494396e5727108dffb820ee0d18fed4b0aa8ab76e4f5bc32f1", [:mix], [{:tzdata, "~> 0.5.8 or ~> 0.1.201603", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
"calendar": {:hex, :calendar, "0.17.6", "ec291cb2e4ba499c2e8c0ef5f4ace974e2f9d02ae9e807e711a9b0c7850b9aee", [:mix], [{:tzdata, "~> 0.5.20 or ~> 0.1.201603 or ~> 1.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
"certifi": {:hex, :certifi, "2.5.1", "867ce347f7c7d78563450a18a6a28a8090331e77fa02380b4a21962a65d36ee5", [:rebar3], [{:parse_trans, "~>3.3", [hex: :parse_trans, repo: "hexpm", optional: false]}], "hexpm"},
"combine": {:hex, :combine, "0.10.0", "eff8224eeb56498a2af13011d142c5e7997a80c8f5b97c499f84c841032e429f", [:mix], [], "hexpm"},
"comeonin": {:hex, :comeonin, "4.1.1", "c7304fc29b45b897b34142a91122bc72757bc0c295e9e824999d5179ffc08416", [:mix], [{:argon2_elixir, "~> 1.2", [hex: :argon2_elixir, repo: "hexpm", optional: true]}, {:bcrypt_elixir, "~> 0.12.1 or ~> 1.0", [hex: :bcrypt_elixir, repo: "hexpm", optional: true]}, {:pbkdf2_elixir, "~> 0.12", [hex: :pbkdf2_elixir, repo: "hexpm", optional: true]}], "hexpm"},
@ -33,9 +33,9 @@
"ex_syslogger": {:git, "https://github.com/slashmili/ex_syslogger.git", "f3963399047af17e038897c69e20d552e6899e1d", [tag: "1.4.0"]},
"excoveralls": {:hex, :excoveralls, "0.11.1", "dd677fbdd49114fdbdbf445540ec735808250d56b011077798316505064edb2c", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm"},
"floki": {:hex, :floki, "0.20.4", "be42ac911fece24b4c72f3b5846774b6e61b83fe685c2fc9d62093277fb3bc86", [:mix], [{:html_entities, "~> 0.4.0", [hex: :html_entities, repo: "hexpm", optional: false]}, {:mochiweb, "~> 2.15", [hex: :mochiweb, repo: "hexpm", optional: false]}], "hexpm"},
"gen_smtp": {:hex, :gen_smtp, "0.13.0", "11f08504c4bdd831dc520b8f84a1dce5ce624474a797394e7aafd3c29f5dcd25", [:rebar3], [], "hexpm"},
"gettext": {:hex, :gettext, "0.15.0", "40a2b8ce33a80ced7727e36768499fc9286881c43ebafccae6bab731e2b2b8ce", [:mix], [], "hexpm"},
"hackney": {:hex, :hackney, "1.15.1", "9f8f471c844b8ce395f7b6d8398139e26ddca9ebc171a8b91342ee15a19963f4", [:rebar3], [{:certifi, "2.5.1", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "6.0.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "1.0.1", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "1.1.4", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm"},
"gen_smtp": {:hex, :gen_smtp, "0.14.0", "39846a03522456077c6429b4badfd1d55e5e7d0fdfb65e935b7c5e38549d9202", [:rebar3], [], "hexpm"},
"gettext": {:hex, :gettext, "0.17.0", "abe21542c831887a2b16f4c94556db9c421ab301aee417b7c4fbde7fbdbe01ec", [:mix], [], "hexpm"},
"hackney": {:hex, :hackney, "1.15.2", "07e33c794f8f8964ee86cebec1a8ed88db5070e52e904b8f12209773c1036085", [:rebar3], [{:certifi, "2.5.1", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "6.0.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "1.0.1", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "1.1.5", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm"},
"html_entities": {:hex, :html_entities, "0.4.0", "f2fee876858cf6aaa9db608820a3209e45a087c5177332799592142b50e89a6b", [:mix], [], "hexpm"},
"html_sanitize_ex": {:hex, :html_sanitize_ex, "1.3.0", "f005ad692b717691203f940c686208aa3d8ffd9dd4bb3699240096a51fa9564e", [:mix], [{:mochiweb, "~> 2.15", [hex: :mochiweb, repo: "hexpm", optional: false]}], "hexpm"},
"http_signatures": {:git, "https://git.pleroma.social/pleroma/http_signatures.git", "9789401987096ead65646b52b5a2ca6bf52fc531", [ref: "9789401987096ead65646b52b5a2ca6bf52fc531"]},
@ -76,14 +76,14 @@
"quack": {:hex, :quack, "0.1.1", "cca7b4da1a233757fdb44b3334fce80c94785b3ad5a602053b7a002b5a8967bf", [:mix], [{:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: false]}, {:tesla, "~> 1.2.0", [hex: :tesla, repo: "hexpm", optional: false]}], "hexpm"},
"ranch": {:hex, :ranch, "1.7.1", "6b1fab51b49196860b733a49c07604465a47bdb78aa10c1c16a3d199f7f8c881", [:rebar3], [], "hexpm"},
"recon": {:git, "https://github.com/ferd/recon.git", "75d70c7c08926d2f24f1ee6de14ee50fe8a52763", [tag: "2.4.0"]},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.4", "f0eafff810d2041e93f915ef59899c923f4568f4585904d010387ed74988e77b", [:make, :mix, :rebar3], [], "hexpm"},
"swoosh": {:hex, :swoosh, "0.20.0", "9a6c13822c9815993c03b6f8fccc370fcffb3c158d9754f67b1fdee6b3a5d928", [:mix], [{:cowboy, "~> 1.0.1 or ~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.12", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mime, "~> 1.1", [hex: :mime, repo: "hexpm", optional: false]}, {:plug, "~> 1.4", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.5", "6eaf7ad16cb568bb01753dbbd7a95ff8b91c7979482b95f38443fe2c8852a79b", [:make, :mix, :rebar3], [], "hexpm"},
"syslog": {:git, "https://github.com/Vagabond/erlang-syslog.git", "4a6c6f2c996483e86c1320e9553f91d337bcb6aa", [tag: "1.0.5"]},
"telemetry": {:hex, :telemetry, "0.4.0", "8339bee3fa8b91cb84d14c2935f8ecf399ccd87301ad6da6b71c09553834b2ab", [:rebar3], [], "hexpm"},
"tesla": {:hex, :tesla, "1.2.1", "864783cc27f71dd8c8969163704752476cec0f3a51eb3b06393b3971dc9733ff", [:mix], [{:exjsx, ">= 3.0.0", [hex: :exjsx, repo: "hexpm", optional: true]}, {:fuse, "~> 2.4", [hex: :fuse, repo: "hexpm", optional: true]}, {:hackney, "~> 1.6", [hex: :hackney, repo: "hexpm", optional: true]}, {:ibrowse, "~> 4.4.0", [hex: :ibrowse, repo: "hexpm", optional: true]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: true]}, {:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:poison, ">= 1.0.0", [hex: :poison, repo: "hexpm", optional: true]}], "hexpm"},
"timex": {:hex, :timex, "3.5.0", "b0a23167da02d0fe4f1a4e104d1f929a00d348502b52432c05de875d0b9cffa5", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
"timex": {:hex, :timex, "3.6.1", "efdf56d0e67a6b956cc57774353b0329c8ab7726766a11547e529357ffdc1d56", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5 or ~> 1.0.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm"},
"trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
"tzdata": {:hex, :tzdata, "0.5.20", "304b9e98a02840fb32a43ec111ffbe517863c8566eb04a061f1c4dbb90b4d84c", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm"},
"tzdata": {:hex, :tzdata, "0.5.21", "8cbf3607fcce69636c672d5be2bbb08687fe26639a62bdcc283d267277db7cf0", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm"},
"ueberauth": {:hex, :ueberauth, "0.6.1", "9e90d3337dddf38b1ca2753aca9b1e53d8a52b890191cdc55240247c89230412", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.4.1", "d869e4c68901dd9531385bb0c8c40444ebf624e60b6962d95952775cac5e90cd", [:rebar3], [], "hexpm"},
"unsafe": {:hex, :unsafe, "1.0.0", "7c21742cd05380c7875546b023481d3a26f52df8e5dfedcb9f958f322baae305", [:mix], [], "hexpm"},

View File

@ -1,19 +1,31 @@
defmodule Pleroma.Repo.Migrations.CaseInsensivtivity do
use Ecto.Migration
# Two-steps alters are intentional.
# When alter of 2 columns is done in a single operation,
# inconsistent failures happen because of index on `email` column.
def up do
execute ("create extension if not exists citext")
execute("create extension if not exists citext")
alter table(:users) do
modify :email, :citext
modify :nickname, :citext
modify(:email, :citext)
end
alter table(:users) do
modify(:nickname, :citext)
end
end
def down do
alter table(:users) do
modify :email, :string
modify :nickname, :string
modify(:email, :string)
end
execute ("drop extension if exists citext")
alter table(:users) do
modify(:nickname, :string)
end
execute("drop extension if exists citext")
end
end

View File

@ -0,0 +1,7 @@
defmodule Pleroma.Repo.Migrations.AddLikesIndexToObjects do
use Ecto.Migration
def change do
create_if_not_exists index(:objects, ["(data->'likes')"], using: :gin, name: :objects_likes)
end
end

View File

@ -11,6 +11,7 @@ end %>
config :pleroma, Pleroma.Web.Endpoint,
url: [host: "<%= domain %>", scheme: "https", port: <%= port %>],
http: [ip: {<%= String.replace(listen_ip, ".", ", ") %>}, port: <%= listen_port %>],
secret_key_base: "<%= secret %>",
signing_salt: "<%= signing_salt %>"

View File

@ -30,33 +30,67 @@ detect_flavour() {
detect_branch() {
version="$(cut -d' ' -f2 <"$RELEASE_ROOT"/releases/start_erl.data)"
branch="$(echo "$version" | cut -d'-' -f 4)"
# Expected format: major.minor.patch_version(-number_of_commits_ahead_of_tag-gcommit_hash).branch
branch="$(echo "$version" | cut -d'.' -f 4)"
if [ "$branch" = "develop" ]; then
echo "develop"
elif [ "$branch" = "" ]; then
echo "master"
echo "stable"
else
echo "Releases are built only for master and develop branches" >&2
# Note: branch name in version is of SemVer format and may only contain [0-9a-zA-Z-] symbols —
# if supporting releases for more branches, need to ensure they contain only these symbols.
echo "Can't detect the branch automatically, please specify it by using the --branch option." >&2
exit 1
fi
}
update() {
set -e
NO_RM=false
while echo "$1" | grep "^-" >/dev/null; do
case "$1" in
--zip-url)
FULL_URI="$2"
shift 2
;;
--no-rm)
NO_RM=true
shift
;;
--flavour)
FLAVOUR="$2"
shift 2
;;
--branch)
BRANCH="$2"
shift 2
;;
--tmp-dir)
TMP_DIR="$2"
shift 2
;;
-*)
echo "invalid option: $1" 1>&2
shift
;;
esac
done
RELEASE_ROOT=$(dirname "$SCRIPTPATH")
uri="${PLEROMA_CTL_URI:-https://git.pleroma.social}"
project_id="${PLEROMA_CTL_PROJECT_ID:-2}"
project_branch="$(detect_branch)"
flavour="${PLEROMA_CTL_FLAVOUR:-$(detect_flavour)}"
echo "Detected flavour: $flavour"
tmp="${PLEROMA_CTL_TMP_DIR:-/tmp}"
uri="https://git.pleroma.social"
project_id="2"
project_branch="${BRANCH:-$(detect_branch)}"
flavour="${FLAVOUR:-$(detect_flavour)}"
tmp="${TMP_DIR:-/tmp}"
artifact="$tmp/pleroma.zip"
full_uri="${uri}/api/v4/projects/${project_id}/jobs/artifacts/${project_branch}/download?job=${flavour}"
full_uri="${FULL_URI:-${uri}/api/v4/projects/${project_id}/jobs/artifacts/${project_branch}/download?job=${flavour}}"
echo "Downloading the artifact from ${full_uri} to ${artifact}"
curl "$full_uri" -o "${artifact}"
echo "Unpacking ${artifact} to ${tmp}"
unzip -q "$artifact" -d "$tmp"
echo "Copying files over to $RELEASE_ROOT"
if [ "$1" != "--no-rm" ]; then
if [ "$NO_RM" = false ]; then
echo "Removing files from the previous release"
rm -r "${RELEASE_ROOT:-?}"/*
fi
cp -rf "$tmp/release"/* "$RELEASE_ROOT"
@ -83,36 +117,41 @@ if [ -z "$1" ] || [ "$1" = "help" ]; then
Rollback database migrations (needs to be done before downgrading)
update [OPTIONS]
Update the instance using the latest CI artifact for the current branch.
The only supported option is --no-rm, when set the script won't delete the whole directory, but
just force copy over files from the new release. This wastes more space, but may be useful if
some files are stored inside of the release directories (although you really shouldn't store them
there), or if you want to be able to quickly revert a broken update.
The script will try to detect your architecture and ABI and set a flavour automatically,
but if it is wrong, you can overwrite it by setting PLEROMA_CTL_FLAVOUR to the desired flavour.
By default the artifact will be downloaded from https://git.pleroma.social for pleroma/pleroma (project id: 2)
to /tmp/, you can overwrite these settings by setting PLEROMA_CTL_URI, PLEROMA_CTL_PROJECT_ID and PLEROMA_CTL_TMP_DIR
respectively.
Update the instance.
Options:
--branch Update to a specified branch, instead of the latest version of the current one.
--flavour Update to a specified flavour (CPU architecture+libc), instead of the current one.
--zip-url Get the release from a specified url. If set, renders the previous 2 options inactive.
--no-rm Do not erase previous release's files.
--tmp-dir Download the temporary files to a specified directory.
and any mix tasks under Pleroma namespace, for example \`mix pleroma.user COMMAND\` is
equivalent to \`$(basename "$0") user COMMAND\`
By default pleroma_ctl will try calling into a running instance to execute non migration-related commands,
if for some reason this is undesired, set PLEROMA_CTL_RPC_DISABLED environment variable
if for some reason this is undesired, set PLEROMA_CTL_RPC_DISABLED environment variable.
"
else
SCRIPT=$(readlink -f "$0")
SCRIPTPATH=$(dirname "$SCRIPT")
if [ "$1" = "update" ]; then
update "$2"
elif [ "$1" = "migrate" ] || [ "$1" = "rollback" ] || [ "$1" = "create" ] || [ "$1 $2" = "instance gen" ] || [ -n "$PLEROMA_CTL_RPC_DISABLED" ]; then
"$SCRIPTPATH"/pleroma eval 'Pleroma.ReleaseTasks.run("'"$*"'")'
FULL_ARGS="$*"
ACTION="$1"
shift
if [ "$(echo \"$1\" | grep \"^-\" >/dev/null)" = false ]; then
SUBACTION="$1"
shift
fi
if [ "$ACTION" = "update" ]; then
update "$@"
elif [ "$ACTION" = "migrate" ] || [ "$ACTION" = "rollback" ] || [ "$ACTION" = "create" ] || [ "$ACTION $SUBACTION" = "instance gen" ] || [ "$PLEROMA_CTL_RPC_DISABLED" = true ]; then
"$SCRIPTPATH"/pleroma eval 'Pleroma.ReleaseTasks.run("'"$FULL_ARGS"'")'
else
"$SCRIPTPATH"/pleroma rpc 'Pleroma.ReleaseTasks.run("'"$*"'")'
"$SCRIPTPATH"/pleroma rpc 'Pleroma.ReleaseTasks.run("'"$FULL_ARGS"'")'
fi
fi

View File

@ -96,10 +96,10 @@ test "validates signature" do
assert decode_url(sig, base64) == {:error, :invalid_signature}
end
test "filename_matches matches url encoded paths" do
test "filename_matches preserves the encoded or decoded path" do
assert MediaProxyController.filename_matches(
true,
"/Hello%20world.jpg",
"/Hello world.jpg",
"http://pleroma.social/Hello world.jpg"
) == :ok
@ -108,19 +108,22 @@ test "filename_matches matches url encoded paths" do
"/Hello%20world.jpg",
"http://pleroma.social/Hello%20world.jpg"
) == :ok
assert MediaProxyController.filename_matches(
true,
"/my%2Flong%2Furl%2F2019%2F07%2FS.jpg",
"http://pleroma.social/my%2Flong%2Furl%2F2019%2F07%2FS.jpg"
) == :ok
end
test "filename_matches matches non-url encoded paths" do
assert MediaProxyController.filename_matches(
true,
"/Hello world.jpg",
"http://pleroma.social/Hello%20world.jpg"
) == :ok
test "encoded url are tried to match for proxy as `conn.request_path` encodes the url" do
# conn.request_path will return encoded url
request_path = "/ANALYSE-DAI-_-LE-STABLECOIN-100-D%C3%89CENTRALIS%C3%89-BQ.jpg"
assert MediaProxyController.filename_matches(
true,
"/Hello world.jpg",
"http://pleroma.social/Hello world.jpg"
request_path,
"https://mydomain.com/uploads/2019/07/ANALYSE-DAI-_-LE-STABLECOIN-100-DÉCENTRALISÉ-BQ.jpg"
) == :ok
end

View File

@ -531,5 +531,63 @@ test "replying to a deleted post without tagging does not generate a notificatio
assert Enum.empty?(Notification.for_user(user))
end
test "notifications are deleted if a local user is deleted" do
user = insert(:user)
other_user = insert(:user)
{:ok, _activity} =
CommonAPI.post(user, %{"status" => "hi @#{other_user.nickname}", "visibility" => "direct"})
refute Enum.empty?(Notification.for_user(other_user))
User.delete(user)
assert Enum.empty?(Notification.for_user(other_user))
end
test "notifications are deleted if a remote user is deleted" do
remote_user = insert(:user)
local_user = insert(:user)
dm_message = %{
"@context" => "https://www.w3.org/ns/activitystreams",
"type" => "Create",
"actor" => remote_user.ap_id,
"id" => remote_user.ap_id <> "/activities/test",
"to" => [local_user.ap_id],
"cc" => [],
"object" => %{
"type" => "Note",
"content" => "Hello!",
"tag" => [
%{
"type" => "Mention",
"href" => local_user.ap_id,
"name" => "@#{local_user.nickname}"
}
],
"to" => [local_user.ap_id],
"cc" => [],
"attributedTo" => remote_user.ap_id
}
}
{:ok, _dm_activity} = Transmogrifier.handle_incoming(dm_message)
refute Enum.empty?(Notification.for_user(local_user))
delete_user_message = %{
"@context" => "https://www.w3.org/ns/activitystreams",
"id" => remote_user.ap_id <> "/activities/delete",
"actor" => remote_user.ap_id,
"type" => "Delete",
"object" => remote_user.ap_id
}
{:ok, _delete_activity} = Transmogrifier.handle_incoming(delete_user_message)
assert Enum.empty?(Notification.for_user(local_user))
end
end
end

View File

@ -64,4 +64,34 @@ test "users cannot be collided through fake direction spoofing attempts" do
"[error] Could not decode user at fetch https://n1u.moe/users/rye, {:error, :error}"
end
end
describe "containment of children" do
test "contain_child() catches spoofing attempts" do
data = %{
"id" => "http://example.com/whatever",
"type" => "Create",
"object" => %{
"id" => "http://example.net/~alyssa/activities/1234",
"attributedTo" => "http://example.org/~alyssa"
},
"actor" => "http://example.com/~bob"
}
:error = Containment.contain_child(data)
end
test "contain_child() allows correct origins" do
data = %{
"id" => "http://example.org/~alyssa/activities/5678",
"type" => "Create",
"object" => %{
"id" => "http://example.org/~alyssa/activities/1234",
"attributedTo" => "http://example.org/~alyssa"
},
"actor" => "http://example.org/~alyssa"
}
:ok = Containment.contain_child(data)
end
end
end

View File

@ -5,6 +5,7 @@ defmodule Pleroma.Object.FetcherTest do
alias Pleroma.Object
alias Pleroma.Object.Fetcher
import Tesla.Mock
import Mock
setup do
mock(fn
@ -22,16 +23,31 @@ defmodule Pleroma.Object.FetcherTest do
end
describe "actor origin containment" do
test "it rejects objects with a bogus origin" do
test_with_mock "it rejects objects with a bogus origin",
Pleroma.Web.OStatus,
[:passthrough],
[] do
{:error, _} = Fetcher.fetch_object_from_id("https://info.pleroma.site/activity.json")
refute called(Pleroma.Web.OStatus.fetch_activity_from_url(:_))
end
test "it rejects objects when attributedTo is wrong (variant 1)" do
test_with_mock "it rejects objects when attributedTo is wrong (variant 1)",
Pleroma.Web.OStatus,
[:passthrough],
[] do
{:error, _} = Fetcher.fetch_object_from_id("https://info.pleroma.site/activity2.json")
refute called(Pleroma.Web.OStatus.fetch_activity_from_url(:_))
end
test "it rejects objects when attributedTo is wrong (variant 2)" do
test_with_mock "it rejects objects when attributedTo is wrong (variant 2)",
Pleroma.Web.OStatus,
[:passthrough],
[] do
{:error, _} = Fetcher.fetch_object_from_id("https://info.pleroma.site/activity3.json")
refute called(Pleroma.Web.OStatus.fetch_activity_from_url(:_))
end
end

View File

@ -798,6 +798,81 @@ def get("http://404.site" <> _, _, _, _) do
}}
end
def get(
"https://zetsubou.xn--q9jyb4c/.well-known/webfinger?resource=lain@zetsubou.xn--q9jyb4c",
_,
_,
Accept: "application/xrd+xml,application/jrd+json"
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/lain.xml")
}}
end
def get(
"https://zetsubou.xn--q9jyb4c/.well-known/webfinger?resource=https://zetsubou.xn--q9jyb4c/users/lain",
_,
_,
Accept: "application/xrd+xml,application/jrd+json"
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/lain.xml")
}}
end
def get(
"https://zetsubou.xn--q9jyb4c/.well-known/host-meta",
_,
_,
_
) do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/host-meta-zetsubou.xn--q9jyb4c.xml")
}}
end
def get("https://info.pleroma.site/activity.json", _, _, Accept: "application/activity+json") do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/httpoison_mock/https__info.pleroma.site_activity.json")
}}
end
def get("https://info.pleroma.site/activity.json", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://info.pleroma.site/activity2.json", _, _, Accept: "application/activity+json") do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/httpoison_mock/https__info.pleroma.site_activity2.json")
}}
end
def get("https://info.pleroma.site/activity2.json", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get("https://info.pleroma.site/activity3.json", _, _, Accept: "application/activity+json") do
{:ok,
%Tesla.Env{
status: 200,
body: File.read!("test/fixtures/httpoison_mock/https__info.pleroma.site_activity3.json")
}}
end
def get("https://info.pleroma.site/activity3.json", _, _, _) do
{:ok, %Tesla.Env{status: 404, body: ""}}
end
def get(url, query, body, headers) do
{:error,
"Not implemented the mock response for get #{inspect(url)}, #{query}, #{inspect(body)}, #{

View File

@ -0,0 +1,13 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2018 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule MRFModuleMock do
@behaviour Pleroma.Web.ActivityPub.MRF
@impl true
def filter(message), do: {:ok, message}
@impl true
def describe, do: {:ok, %{mrf_module_mock: "some config data"}}
end

View File

@ -3,8 +3,11 @@
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Mix.Tasks.Pleroma.DatabaseTest do
alias Pleroma.Object
alias Pleroma.Repo
alias Pleroma.User
alias Pleroma.Web.CommonAPI
use Pleroma.DataCase
import Pleroma.Factory
@ -46,4 +49,37 @@ test "following and followers count are updated" do
assert user.info.follower_count == 0
end
end
describe "running fix_likes_collections" do
test "it turns OrderedCollection likes into empty arrays" do
[user, user2] = insert_pair(:user)
{:ok, %{id: id, object: object}} = CommonAPI.post(user, %{"status" => "test"})
{:ok, %{object: object2}} = CommonAPI.post(user, %{"status" => "test test"})
CommonAPI.favorite(id, user2)
likes = %{
"first" =>
"http://mastodon.example.org/objects/dbdbc507-52c8-490d-9b7c-1e1d52e5c132/likes?page=1",
"id" => "http://mastodon.example.org/objects/dbdbc507-52c8-490d-9b7c-1e1d52e5c132/likes",
"totalItems" => 3,
"type" => "OrderedCollection"
}
new_data = Map.put(object2.data, "likes", likes)
object2
|> Ecto.Changeset.change(%{data: new_data})
|> Repo.update()
assert length(Object.get_by_id(object.id).data["likes"]) == 1
assert is_map(Object.get_by_id(object2.id).data["likes"])
assert :ok == Mix.Tasks.Pleroma.Database.run(["fix_likes_collections"])
assert length(Object.get_by_id(object.id).data["likes"]) == 1
assert Enum.empty?(Object.get_by_id(object2.id).data["likes"])
end
end
end

View File

@ -38,7 +38,17 @@ test "running gen" do
"--indexable",
"y",
"--db-configurable",
"y"
"y",
"--rum",
"y",
"--listen-port",
"4000",
"--listen-ip",
"127.0.0.1",
"--uploads-dir",
"test/uploads",
"--static-dir",
"instance/static/"
])
end
@ -56,10 +66,11 @@ test "running gen" do
assert generated_config =~ "username: \"dbuser\""
assert generated_config =~ "password: \"dbpass\""
assert generated_config =~ "dynamic_configuration: true"
assert generated_config =~ "http: [ip: {127, 0, 0, 1}, port: 4000]"
assert File.read!(tmp_path() <> "setup.psql") == generated_setup_psql()
end
defp generated_setup_psql do
~s(CREATE USER dbuser WITH ENCRYPTED PASSWORD 'dbpass';\nCREATE DATABASE dbname OWNER dbuser;\n\\c dbname;\n--Extensions made by ecto.migrate that need superuser access\nCREATE EXTENSION IF NOT EXISTS citext;\nCREATE EXTENSION IF NOT EXISTS pg_trgm;\nCREATE EXTENSION IF NOT EXISTS \"uuid-ossp\";\n)
~s(CREATE USER dbuser WITH ENCRYPTED PASSWORD 'dbpass';\nCREATE DATABASE dbname OWNER dbuser;\n\\c dbname;\n--Extensions made by ecto.migrate that need superuser access\nCREATE EXTENSION IF NOT EXISTS citext;\nCREATE EXTENSION IF NOT EXISTS pg_trgm;\nCREATE EXTENSION IF NOT EXISTS \"uuid-ossp\";\nCREATE EXTENSION IF NOT EXISTS rum;\n)
end
end

View File

@ -799,6 +799,48 @@ test "blocks domains" do
assert User.blocks?(user, collateral_user)
end
test "does not block domain with same end" do
user = insert(:user)
collateral_user =
insert(:user, %{ap_id: "https://another-awful-and-rude-instance.com/user/bully"})
{:ok, user} = User.block_domain(user, "awful-and-rude-instance.com")
refute User.blocks?(user, collateral_user)
end
test "does not block domain with same end if wildcard added" do
user = insert(:user)
collateral_user =
insert(:user, %{ap_id: "https://another-awful-and-rude-instance.com/user/bully"})
{:ok, user} = User.block_domain(user, "*.awful-and-rude-instance.com")
refute User.blocks?(user, collateral_user)
end
test "blocks domain with wildcard for subdomain" do
user = insert(:user)
user_from_subdomain =
insert(:user, %{ap_id: "https://subdomain.awful-and-rude-instance.com/user/bully"})
user_with_two_subdomains =
insert(:user, %{
ap_id: "https://subdomain.second_subdomain.awful-and-rude-instance.com/user/bully"
})
user_domain = insert(:user, %{ap_id: "https://awful-and-rude-instance.com/user/bully"})
{:ok, user} = User.block_domain(user, "*.awful-and-rude-instance.com")
assert User.blocks?(user, user_from_subdomain)
assert User.blocks?(user, user_with_two_subdomains)
assert User.blocks?(user, user_domain)
end
test "unblocks domains" do
user = insert(:user)
collateral_user = insert(:user, %{ap_id: "https://awful-and-rude-instance.com/user/bully"})

View File

@ -308,7 +308,7 @@ test "it returns a note activity in a collection", %{conn: conn} do
conn
|> assign(:user, user)
|> put_req_header("accept", "application/activity+json")
|> get("/users/#{user.nickname}/inbox")
|> get("/users/#{user.nickname}/inbox?page=true")
assert response(conn, 200) =~ note_activity.data["object"]["content"]
end
@ -393,7 +393,7 @@ test "it returns a note activity in a collection", %{conn: conn} do
conn =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/users/#{user.nickname}/outbox")
|> get("/users/#{user.nickname}/outbox?page=true")
assert response(conn, 200) =~ note_activity.data["object"]["content"]
end
@ -405,7 +405,7 @@ test "it returns an announce activity in a collection", %{conn: conn} do
conn =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/users/#{user.nickname}/outbox")
|> get("/users/#{user.nickname}/outbox?page=true")
assert response(conn, 200) =~ announce_activity.data["object"]
end

View File

@ -679,9 +679,6 @@ test "adds a like activity to the db" do
assert like_activity == same_like_activity
assert object.data["likes"] == [user.ap_id]
[note_activity] = Activity.get_all_create_by_object_ap_id(object.data["id"])
assert note_activity.data["object"]["like_count"] == 1
{:ok, _like_activity, object} = ActivityPub.like(user_two, object)
assert object.data["like_count"] == 2
end

View File

@ -0,0 +1,86 @@
defmodule Pleroma.Web.ActivityPub.MRFTest do
use ExUnit.Case, async: true
alias Pleroma.Web.ActivityPub.MRF
test "subdomains_regex/1" do
assert MRF.subdomains_regex(["unsafe.tld", "*.unsafe.tld"]) == [
~r/^unsafe.tld$/i,
~r/^(.*\.)*unsafe.tld$/i
]
end
describe "subdomain_match/2" do
test "common domains" do
regexes = MRF.subdomains_regex(["unsafe.tld", "unsafe2.tld"])
assert regexes == [~r/^unsafe.tld$/i, ~r/^unsafe2.tld$/i]
assert MRF.subdomain_match?(regexes, "unsafe.tld")
assert MRF.subdomain_match?(regexes, "unsafe2.tld")
refute MRF.subdomain_match?(regexes, "example.com")
end
test "wildcard domains with one subdomain" do
regexes = MRF.subdomains_regex(["*.unsafe.tld"])
assert regexes == [~r/^(.*\.)*unsafe.tld$/i]
assert MRF.subdomain_match?(regexes, "unsafe.tld")
assert MRF.subdomain_match?(regexes, "sub.unsafe.tld")
refute MRF.subdomain_match?(regexes, "anotherunsafe.tld")
refute MRF.subdomain_match?(regexes, "unsafe.tldanother")
end
test "wildcard domains with two subdomains" do
regexes = MRF.subdomains_regex(["*.unsafe.tld"])
assert regexes == [~r/^(.*\.)*unsafe.tld$/i]
assert MRF.subdomain_match?(regexes, "unsafe.tld")
assert MRF.subdomain_match?(regexes, "sub.sub.unsafe.tld")
refute MRF.subdomain_match?(regexes, "sub.anotherunsafe.tld")
refute MRF.subdomain_match?(regexes, "sub.unsafe.tldanother")
end
test "matches are case-insensitive" do
regexes = MRF.subdomains_regex(["UnSafe.TLD", "UnSAFE2.Tld"])
assert regexes == [~r/^UnSafe.TLD$/i, ~r/^UnSAFE2.Tld$/i]
assert MRF.subdomain_match?(regexes, "UNSAFE.TLD")
assert MRF.subdomain_match?(regexes, "UNSAFE2.TLD")
assert MRF.subdomain_match?(regexes, "unsafe.tld")
assert MRF.subdomain_match?(regexes, "unsafe2.tld")
refute MRF.subdomain_match?(regexes, "EXAMPLE.COM")
refute MRF.subdomain_match?(regexes, "example.com")
end
end
describe "describe/0" do
test "it works as expected with noop policy" do
expected = %{
mrf_policies: ["NoOpPolicy"],
exclusions: false
}
{:ok, ^expected} = MRF.describe()
end
test "it works as expected with mock policy" do
config = Pleroma.Config.get([:instance, :rewrite_policy])
Pleroma.Config.put([:instance, :rewrite_policy], [MRFModuleMock])
expected = %{
mrf_policies: ["MRFModuleMock"],
mrf_module_mock: "some config data",
exclusions: false
}
{:ok, ^expected} = MRF.describe()
Pleroma.Config.put([:instance, :rewrite_policy], config)
end
end
end

View File

@ -49,6 +49,19 @@ test "has a matching host" do
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :media_removal], ["*.remote.instance"])
media_message = build_media_message()
local_message = build_local_message()
assert SimplePolicy.filter(media_message) ==
{:ok,
media_message
|> Map.put("object", Map.delete(media_message["object"], "attachment"))}
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
end
describe "when :media_nsfw" do
@ -74,6 +87,20 @@ test "has a matching host" do
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :media_nsfw], ["*.remote.instance"])
media_message = build_media_message()
local_message = build_local_message()
assert SimplePolicy.filter(media_message) ==
{:ok,
media_message
|> put_in(["object", "tag"], ["foo", "nsfw"])
|> put_in(["object", "sensitive"], true)}
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
end
defp build_media_message do
@ -106,6 +133,15 @@ test "has a matching host" do
assert SimplePolicy.filter(report_message) == {:reject, nil}
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :report_removal], ["*.remote.instance"])
report_message = build_report_message()
local_message = build_local_message()
assert SimplePolicy.filter(report_message) == {:reject, nil}
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
end
defp build_report_message do
@ -146,6 +182,27 @@ test "has a matching host" do
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
test "match with wildcard domain" do
{actor, ftl_message} = build_ftl_actor_and_message()
ftl_message_actor_host =
ftl_message
|> Map.fetch!("actor")
|> URI.parse()
|> Map.fetch!(:host)
Config.put([:mrf_simple, :federated_timeline_removal], ["*." <> ftl_message_actor_host])
local_message = build_local_message()
assert {:ok, ftl_message} = SimplePolicy.filter(ftl_message)
assert actor.follower_address in ftl_message["to"]
refute actor.follower_address in ftl_message["cc"]
refute "https://www.w3.org/ns/activitystreams#Public" in ftl_message["to"]
assert "https://www.w3.org/ns/activitystreams#Public" in ftl_message["cc"]
assert SimplePolicy.filter(local_message) == {:ok, local_message}
end
test "has a matching host but only as:Public in to" do
{_actor, ftl_message} = build_ftl_actor_and_message()
@ -192,6 +249,14 @@ test "has a matching host" do
assert SimplePolicy.filter(remote_message) == {:reject, nil}
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :reject], ["*.remote.instance"])
remote_message = build_remote_message()
assert SimplePolicy.filter(remote_message) == {:reject, nil}
end
end
describe "when :accept" do
@ -224,6 +289,16 @@ test "has a matching host" do
assert SimplePolicy.filter(local_message) == {:ok, local_message}
assert SimplePolicy.filter(remote_message) == {:ok, remote_message}
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :accept], ["*.remote.instance"])
local_message = build_local_message()
remote_message = build_remote_message()
assert SimplePolicy.filter(local_message) == {:ok, local_message}
assert SimplePolicy.filter(remote_message) == {:ok, remote_message}
end
end
describe "when :avatar_removal" do
@ -251,6 +326,15 @@ test "has a matching host" do
refute filtered["icon"]
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :avatar_removal], ["*.remote.instance"])
remote_user = build_remote_user()
{:ok, filtered} = SimplePolicy.filter(remote_user)
refute filtered["icon"]
end
end
describe "when :banner_removal" do
@ -278,6 +362,15 @@ test "has a matching host" do
refute filtered["image"]
end
test "match with wildcard domain" do
Config.put([:mrf_simple, :banner_removal], ["*.remote.instance"])
remote_user = build_remote_user()
{:ok, filtered} = SimplePolicy.filter(remote_user)
refute filtered["image"]
end
end
defp build_local_message do

View File

@ -0,0 +1,123 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Web.ActivityPub.MRF.VocabularyPolicyTest do
use Pleroma.DataCase
alias Pleroma.Web.ActivityPub.MRF.VocabularyPolicy
describe "accept" do
test "it accepts based on parent activity type" do
config = Pleroma.Config.get([:mrf_vocabulary, :accept])
Pleroma.Config.put([:mrf_vocabulary, :accept], ["Like"])
message = %{
"type" => "Like",
"object" => "whatever"
}
{:ok, ^message} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :accept], config)
end
test "it accepts based on child object type" do
config = Pleroma.Config.get([:mrf_vocabulary, :accept])
Pleroma.Config.put([:mrf_vocabulary, :accept], ["Create", "Note"])
message = %{
"type" => "Create",
"object" => %{
"type" => "Note",
"content" => "whatever"
}
}
{:ok, ^message} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :accept], config)
end
test "it does not accept disallowed child objects" do
config = Pleroma.Config.get([:mrf_vocabulary, :accept])
Pleroma.Config.put([:mrf_vocabulary, :accept], ["Create", "Note"])
message = %{
"type" => "Create",
"object" => %{
"type" => "Article",
"content" => "whatever"
}
}
{:reject, nil} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :accept], config)
end
test "it does not accept disallowed parent types" do
config = Pleroma.Config.get([:mrf_vocabulary, :accept])
Pleroma.Config.put([:mrf_vocabulary, :accept], ["Announce", "Note"])
message = %{
"type" => "Create",
"object" => %{
"type" => "Note",
"content" => "whatever"
}
}
{:reject, nil} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :accept], config)
end
end
describe "reject" do
test "it rejects based on parent activity type" do
config = Pleroma.Config.get([:mrf_vocabulary, :reject])
Pleroma.Config.put([:mrf_vocabulary, :reject], ["Like"])
message = %{
"type" => "Like",
"object" => "whatever"
}
{:reject, nil} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :reject], config)
end
test "it rejects based on child object type" do
config = Pleroma.Config.get([:mrf_vocabulary, :reject])
Pleroma.Config.put([:mrf_vocabulary, :reject], ["Note"])
message = %{
"type" => "Create",
"object" => %{
"type" => "Note",
"content" => "whatever"
}
}
{:reject, nil} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :reject], config)
end
test "it passes through objects that aren't disallowed" do
config = Pleroma.Config.get([:mrf_vocabulary, :reject])
Pleroma.Config.put([:mrf_vocabulary, :reject], ["Like"])
message = %{
"type" => "Announce",
"object" => "whatever"
}
{:ok, ^message} = VocabularyPolicy.filter(message)
Pleroma.Config.put([:mrf_vocabulary, :reject], config)
end
end
end

View File

@ -11,12 +11,13 @@ defmodule Pleroma.Web.ActivityPub.TransmogrifierTest do
alias Pleroma.User
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.OStatus
alias Pleroma.Web.Websub.WebsubClientSubscription
import Mock
import Pleroma.Factory
import ExUnit.CaptureLog
alias Pleroma.Web.CommonAPI
setup_all do
Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
@ -46,12 +47,10 @@ test "it fetches replied-to activities if we don't have them" do
data["object"]
|> Map.put("inReplyTo", "https://shitposter.club/notice/2827873")
data =
data
|> Map.put("object", object)
data = Map.put(data, "object", object)
{:ok, returned_activity} = Transmogrifier.handle_incoming(data)
returned_object = Object.normalize(returned_activity.data["object"])
returned_object = Object.normalize(returned_activity.data["object"], false)
assert activity =
Activity.get_create_by_object_ap_id(
@ -61,6 +60,32 @@ test "it fetches replied-to activities if we don't have them" do
assert returned_object.data["inReplyToAtomUri"] == "https://shitposter.club/notice/2827873"
end
test "it does not fetch replied-to activities beyond max_replies_depth" do
data =
File.read!("test/fixtures/mastodon-post-activity.json")
|> Poison.decode!()
object =
data["object"]
|> Map.put("inReplyTo", "https://shitposter.club/notice/2827873")
data = Map.put(data, "object", object)
with_mock Pleroma.Web.Federator,
allowed_incoming_reply_depth?: fn _ -> false end do
{:ok, returned_activity} = Transmogrifier.handle_incoming(data)
returned_object = Object.normalize(returned_activity.data["object"], false)
refute Activity.get_create_by_object_ap_id(
"tag:shitposter.club,2017-05-05:noticeId=2827873:objectType=comment"
)
assert returned_object.data["inReplyToAtomUri"] ==
"https://shitposter.club/notice/2827873"
end
end
test "it does not crash if the object in inReplyTo can't be fetched" do
data =
File.read!("test/fixtures/mastodon-post-activity.json")
@ -390,6 +415,7 @@ test "it ensures that as:Public activities make it to their followers collection
|> Map.put("attributedTo", user.ap_id)
|> Map.put("to", ["https://www.w3.org/ns/activitystreams#Public"])
|> Map.put("cc", [])
|> Map.put("id", user.ap_id <> "/activities/12345678")
data = Map.put(data, "object", object)
@ -413,6 +439,7 @@ test "it ensures that address fields become lists" do
|> Map.put("attributedTo", user.ap_id)
|> Map.put("to", nil)
|> Map.put("cc", nil)
|> Map.put("id", user.ap_id <> "/activities/12345678")
data = Map.put(data, "object", object)
@ -422,6 +449,27 @@ test "it ensures that address fields become lists" do
assert !is_nil(data["cc"])
end
test "it strips internal likes" do
data =
File.read!("test/fixtures/mastodon-post-activity.json")
|> Poison.decode!()
likes = %{
"first" =>
"http://mastodon.example.org/objects/dbdbc507-52c8-490d-9b7c-1e1d52e5c132/likes?page=1",
"id" => "http://mastodon.example.org/objects/dbdbc507-52c8-490d-9b7c-1e1d52e5c132/likes",
"totalItems" => 3,
"type" => "OrderedCollection"
}
object = Map.put(data["object"], "likes", likes)
data = Map.put(data, "object", object)
{:ok, %Activity{object: object}} = Transmogrifier.handle_incoming(data)
refute Map.has_key?(object.data, "likes")
end
test "it works for incoming update activities" do
data = File.read!("test/fixtures/mastodon-post-activity.json") |> Poison.decode!()
@ -1008,14 +1056,7 @@ test "it strips internal fields of article" do
assert is_nil(modified["object"]["announcements"])
assert is_nil(modified["object"]["announcement_count"])
assert is_nil(modified["object"]["context_id"])
end
test "it adds like collection to object" do
activity = insert(:note_activity)
{:ok, modified} = Transmogrifier.prepare_outgoing(activity.data)
assert modified["object"]["likes"]["type"] == "OrderedCollection"
assert modified["object"]["likes"]["totalItems"] == 0
assert is_nil(modified["object"]["likes"])
end
test "the directMessage flag is present" do

View File

@ -4,6 +4,7 @@ defmodule Pleroma.Web.ActivityPub.UserViewTest do
alias Pleroma.User
alias Pleroma.Web.ActivityPub.UserView
alias Pleroma.Web.CommonAPI
test "Renders a user, including the public key" do
user = insert(:user)
@ -78,4 +79,35 @@ test "instance users do not expose oAuth endpoints" do
refute result["endpoints"]["oauthTokenEndpoint"]
end
end
test "activity collection page aginates correctly" do
user = insert(:user)
posts =
for i <- 0..25 do
{:ok, activity} = CommonAPI.post(user, %{"status" => "post #{i}"})
activity
end
# outbox sorts chronologically, newest first, with ten per page
posts = Enum.reverse(posts)
%{"next" => next_url} =
UserView.render("activity_collection_page.json", %{
iri: "#{user.ap_id}/outbox",
activities: Enum.take(posts, 10)
})
next_id = Enum.at(posts, 9).id
assert next_url =~ next_id
%{"next" => next_url} =
UserView.render("activity_collection_page.json", %{
iri: "#{user.ap_id}/outbox",
activities: Enum.take(Enum.drop(posts, 10), 10)
})
next_id = Enum.at(posts, 19).id
assert next_url =~ next_id
end
end

View File

@ -188,6 +188,11 @@ test "pin status", %{user: user, activity: activity} do
assert %User{info: %{pinned_activities: [^id]}} = user
end
test "unlisted statuses can be pinned", %{user: user} do
{:ok, activity} = CommonAPI.post(user, %{"status" => "HI!!!", "visibility" => "unlisted"})
assert {:ok, ^activity} = CommonAPI.pin(activity.id, user)
end
test "only self-authored can be pinned", %{activity: activity} do
user = insert(:user)

View File

@ -213,5 +213,21 @@ test "rejects incoming AP docs with incorrect origin" do
:error = Federator.incoming_ap_doc(params)
end
test "it does not crash if MRF rejects the post" do
policies = Pleroma.Config.get([:instance, :rewrite_policy])
mrf_keyword_policy = Pleroma.Config.get(:mrf_keyword)
Pleroma.Config.put([:mrf_keyword, :reject], ["lain"])
Pleroma.Config.put([:instance, :rewrite_policy], Pleroma.Web.ActivityPub.MRF.KeywordPolicy)
params =
File.read!("test/fixtures/mastodon-post-activity.json")
|> Poison.decode!()
assert Federator.incoming_ap_doc(params) == :error
Pleroma.Config.put([:instance, :rewrite_policy], policies)
Pleroma.Config.put(:mrf_keyword, mrf_keyword_policy)
end
end
end

View File

@ -6,6 +6,7 @@ defmodule Pleroma.Web.MastodonAPI.AccountViewTest do
use Pleroma.DataCase
import Pleroma.Factory
alias Pleroma.User
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.MastodonAPI.AccountView
test "Represent a user account" do
@ -275,4 +276,31 @@ test "sanitizes display names" do
result = AccountView.render("account.json", %{user: user})
refute result.display_name == "<marquee> username </marquee>"
end
describe "hiding follows/following" do
test "shows when follows/following are hidden and sets follower/following count to 0" do
user = insert(:user, info: %{hide_followers: true, hide_follows: true})
other_user = insert(:user)
{:ok, user, other_user, _activity} = CommonAPI.follow(user, other_user)
{:ok, _other_user, user, _activity} = CommonAPI.follow(other_user, user)
assert %{
followers_count: 0,
following_count: 0,
pleroma: %{hide_follows: true, hide_followers: true}
} = AccountView.render("account.json", %{user: user})
end
test "shows actual follower/following count to the account owner" do
user = insert(:user, info: %{hide_followers: true, hide_follows: true})
other_user = insert(:user)
{:ok, user, other_user, _activity} = CommonAPI.follow(user, other_user)
{:ok, _other_user, user, _activity} = CommonAPI.follow(other_user, user)
assert %{
followers_count: 1,
following_count: 1
} = AccountView.render("account.json", %{user: user, for: user})
end
end
end

View File

@ -2684,8 +2684,10 @@ test "bookmarks" do
describe "conversation muting" do
setup do
post_user = insert(:user)
user = insert(:user)
{:ok, activity} = CommonAPI.post(user, %{"status" => "HIE"})
{:ok, activity} = CommonAPI.post(post_user, %{"status" => "HIE"})
[user: user, activity: activity]
end
@ -2877,6 +2879,21 @@ test "redirects not logged-in users to the login page", %{conn: conn, path: path
assert redirected_to(conn) == "/web/login"
end
test "redirects not logged-in users to the login page on private instances", %{
conn: conn,
path: path
} do
is_public = Pleroma.Config.get([:instance, :public])
Pleroma.Config.put([:instance, :public], false)
conn = get(conn, path)
assert conn.status == 302
assert redirected_to(conn) == "/web/login"
Pleroma.Config.put([:instance, :public], is_public)
end
test "does not redirect logged in users to the login page", %{conn: conn, path: path} do
token = insert(:oauth_token)
@ -3300,7 +3317,7 @@ test "returns poll entity for object id", %{conn: conn} do
|> get("/api/v1/polls/#{object.id}")
response = json_response(conn, 200)
id = object.id
id = to_string(object.id)
assert %{"id" => ^id, "expired" => false, "multiple" => false} = response
end

View File

@ -361,7 +361,7 @@ test "renders a poll" do
expected = %{
emojis: [],
expired: false,
id: object.id,
id: to_string(object.id),
multiple: false,
options: [
%{title: "absolutely!", votes_count: 0},

View File

@ -83,4 +83,55 @@ test "it returns the safe_dm_mentions feature if enabled", %{conn: conn} do
Pleroma.Config.put([:instance, :safe_dm_mentions], option)
end
test "it shows MRF transparency data if enabled", %{conn: conn} do
config = Pleroma.Config.get([:instance, :rewrite_policy])
Pleroma.Config.put([:instance, :rewrite_policy], [Pleroma.Web.ActivityPub.MRF.SimplePolicy])
option = Pleroma.Config.get([:instance, :mrf_transparency])
Pleroma.Config.put([:instance, :mrf_transparency], true)
simple_config = %{"reject" => ["example.com"]}
Pleroma.Config.put(:mrf_simple, simple_config)
response =
conn
|> get("/nodeinfo/2.1.json")
|> json_response(:ok)
assert response["metadata"]["federation"]["mrf_simple"] == simple_config
Pleroma.Config.put([:instance, :rewrite_policy], config)
Pleroma.Config.put([:instance, :mrf_transparency], option)
Pleroma.Config.put(:mrf_simple, %{})
end
test "it performs exclusions from MRF transparency data if configured", %{conn: conn} do
config = Pleroma.Config.get([:instance, :rewrite_policy])
Pleroma.Config.put([:instance, :rewrite_policy], [Pleroma.Web.ActivityPub.MRF.SimplePolicy])
option = Pleroma.Config.get([:instance, :mrf_transparency])
Pleroma.Config.put([:instance, :mrf_transparency], true)
exclusions = Pleroma.Config.get([:instance, :mrf_transparency_exclusions])
Pleroma.Config.put([:instance, :mrf_transparency_exclusions], ["other.site"])
simple_config = %{"reject" => ["example.com", "other.site"]}
expected_config = %{"reject" => ["example.com"]}
Pleroma.Config.put(:mrf_simple, simple_config)
response =
conn
|> get("/nodeinfo/2.1.json")
|> json_response(:ok)
assert response["metadata"]["federation"]["mrf_simple"] == expected_config
assert response["metadata"]["federation"]["exclusions"] == true
Pleroma.Config.put([:instance, :rewrite_policy], config)
Pleroma.Config.put([:instance, :mrf_transparency], option)
Pleroma.Config.put([:instance, :mrf_transparency_exclusions], exclusions)
Pleroma.Config.put(:mrf_simple, %{})
end
end

View File

@ -11,8 +11,10 @@ defmodule Pleroma.Web.OStatusTest do
alias Pleroma.User
alias Pleroma.Web.OStatus
alias Pleroma.Web.XML
import Pleroma.Factory
import ExUnit.CaptureLog
import Mock
import Pleroma.Factory
setup_all do
Tesla.Mock.mock_global(fn env -> apply(HttpRequestMock, :request, [env]) end)
@ -196,7 +198,7 @@ test "handle incoming retweets - GS, subscription - local message" do
assert retweeted_activity.data["type"] == "Create"
assert retweeted_activity.data["actor"] == user.ap_id
assert retweeted_activity.local
assert retweeted_activity.data["object"]["announcement_count"] == 1
assert Object.normalize(retweeted_activity).data["announcement_count"] == 1
end
test "handle incoming retweets - Mastodon, salmon" do
@ -266,10 +268,13 @@ test "handle incoming favorites with locally available object - GS, websub" do
assert favorited_activity.local
end
test "handle incoming replies" do
test_with_mock "handle incoming replies, fetching replied-to activities if we don't have them",
OStatus,
[:passthrough],
[] do
incoming = File.read!("test/fixtures/incoming_note_activity_answer.xml")
{:ok, [activity]} = OStatus.handle_incoming(incoming)
object = Object.normalize(activity.data["object"])
object = Object.normalize(activity.data["object"], false)
assert activity.data["type"] == "Create"
assert object.data["type"] == "Note"
@ -282,6 +287,23 @@ test "handle incoming replies" do
assert object.data["id"] == "tag:gs.example.org:4040,2017-04-25:noticeId=55:objectType=note"
assert "https://www.w3.org/ns/activitystreams#Public" in activity.data["to"]
assert called(OStatus.fetch_activity_from_url(object.data["inReplyTo"], :_))
end
test_with_mock "handle incoming replies, not fetching replied-to activities beyond max_replies_depth",
OStatus,
[:passthrough],
[] do
incoming = File.read!("test/fixtures/incoming_note_activity_answer.xml")
with_mock Pleroma.Web.Federator,
allowed_incoming_reply_depth?: fn _ -> false end do
{:ok, [activity]} = OStatus.handle_incoming(incoming)
object = Object.normalize(activity.data["object"], false)
refute called(OStatus.fetch_activity_from_url(object.data["inReplyTo"], :_))
end
end
test "handle incoming follows" do
@ -302,6 +324,14 @@ test "handle incoming follows" do
assert User.following?(follower, followed)
end
test "refuse following over OStatus if the followed's account is locked" do
incoming = File.read!("test/fixtures/follow.xml")
_user = insert(:user, info: %{locked: true}, ap_id: "https://pawoo.net/users/pekorino")
{:ok, [{:error, "It's not possible to follow locked accounts over OStatus"}]} =
OStatus.handle_incoming(incoming)
end
test "handle incoming unfollows with existing follow" do
incoming_follow = File.read!("test/fixtures/follow.xml")
{:ok, [_activity]} = OStatus.handle_incoming(incoming_follow)
@ -401,7 +431,7 @@ test "find_or_make_user sets all the nessary input fields" do
}
end
test "find_make_or_update_user takes an author element and returns an updated user" do
test "find_make_or_update_actor takes an author element and returns an updated user" do
uri = "https://social.heldscal.la/user/23211"
{:ok, user} = OStatus.find_or_make_user(uri)
@ -414,14 +444,56 @@ test "find_make_or_update_user takes an author element and returns an updated us
doc = XML.parse_document(File.read!("test/fixtures/23211.atom"))
[author] = :xmerl_xpath.string('//author[1]', doc)
{:ok, user} = OStatus.find_make_or_update_user(author)
{:ok, user} = OStatus.find_make_or_update_actor(author)
assert user.avatar["type"] == "Image"
assert user.name == old_name
assert user.bio == old_bio
{:ok, user_again} = OStatus.find_make_or_update_user(author)
{:ok, user_again} = OStatus.find_make_or_update_actor(author)
assert user_again == user
end
test "find_or_make_user disallows protocol downgrade" do
user = insert(:user, %{local: true})
{:ok, user} = OStatus.find_or_make_user(user.ap_id)
assert User.ap_enabled?(user)
user =
insert(:user, %{
ap_id: "https://social.heldscal.la/user/23211",
info: %{ap_enabled: true},
local: false
})
assert User.ap_enabled?(user)
{:ok, user} = OStatus.find_or_make_user(user.ap_id)
assert User.ap_enabled?(user)
end
test "find_make_or_update_actor disallows protocol downgrade" do
user = insert(:user, %{local: true})
{:ok, user} = OStatus.find_or_make_user(user.ap_id)
assert User.ap_enabled?(user)
user =
insert(:user, %{
ap_id: "https://social.heldscal.la/user/23211",
info: %{ap_enabled: true},
local: false
})
assert User.ap_enabled?(user)
{:ok, user} = OStatus.find_or_make_user(user.ap_id)
assert User.ap_enabled?(user)
doc = XML.parse_document(File.read!("test/fixtures/23211.atom"))
[author] = :xmerl_xpath.string('//author[1]', doc)
{:error, :invalid_protocol} = OStatus.find_make_or_update_actor(author)
end
end
describe "gathering user info from a user id" do

View File

@ -6,6 +6,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilControllerTest do
alias Pleroma.User
alias Pleroma.Web.CommonAPI
import Pleroma.Factory
import Mock
setup do
Tesla.Mock.mock(fn env -> apply(HttpRequestMock, :request, [env]) end)
@ -227,10 +228,67 @@ test "show follow account page if the `acct` is a account link", %{conn: conn} d
end
end
test "GET /api/pleroma/healthcheck", %{conn: conn} do
conn = get(conn, "/api/pleroma/healthcheck")
describe "GET /api/pleroma/healthcheck" do
setup do
config_healthcheck = Pleroma.Config.get([:instance, :healthcheck])
assert conn.status in [200, 503]
on_exit(fn ->
Pleroma.Config.put([:instance, :healthcheck], config_healthcheck)
end)
:ok
end
test "returns 503 when healthcheck disabled", %{conn: conn} do
Pleroma.Config.put([:instance, :healthcheck], false)
response =
conn
|> get("/api/pleroma/healthcheck")
|> json_response(503)
assert response == %{}
end
test "returns 200 when healthcheck enabled and all ok", %{conn: conn} do
Pleroma.Config.put([:instance, :healthcheck], true)
with_mock Pleroma.Healthcheck,
system_info: fn -> %Pleroma.Healthcheck{healthy: true} end do
response =
conn
|> get("/api/pleroma/healthcheck")
|> json_response(200)
assert %{
"active" => _,
"healthy" => true,
"idle" => _,
"memory_used" => _,
"pool_size" => _
} = response
end
end
test "returns 503 when healthcheck enabled and health is false", %{conn: conn} do
Pleroma.Config.put([:instance, :healthcheck], true)
with_mock Pleroma.Healthcheck,
system_info: fn -> %Pleroma.Healthcheck{healthy: false} end do
response =
conn
|> get("/api/pleroma/healthcheck")
|> json_response(503)
assert %{
"active" => _,
"healthy" => false,
"idle" => _,
"memory_used" => _,
"pool_size" => _
} = response
end
end
end
describe "POST /api/pleroma/disable_account" do