I am running Lemmy 0.17.3, installed via docker-compose. In front of it I am running NPM (Nginx Proxy Manager), which handles all of my public-facing Docker containers.
The local site seems to function except for one area: I can’t search for or properly federate with other instances. I believe I have narrowed the problem down to one particular request:
https://lemmy.timgilbert.be/c/[email protected]
This results in the following response in the browser:
404: couldnt_find_community
And this error in the docker logs for the server:
2023-06-13T01:48:29.196432Z WARN Error encountered while processing the incoming HTTP request: lemmy_server::root_span_builder: couldnt_find_community: Failed to resolve actor for reddit@lemmy.ml
0: lemmy_apub::fetcher::resolve_actor_identifier
at crates/apub/src/fetcher/mod.rs:16
1: lemmy_apub::api::read_community::perform
with self=GetCommunity { id: None, name: Some("[email protected]"), auth: Some(Sensitive) }
at crates/apub/src/api/read_community.rs:30
2: lemmy_server::root_span_builder::HTTP request
with http.method=GET http.scheme="http" http.host=lemmy.timgilbert.be http.target=/api/v3/community otel.kind="server" request_id=d7e7d1e0-a03f-4e28-ad98-a4d6027b7a47 http.status_code=400 otel.status_code="OK"
at src/root_span_builder.rs:16
LemmyError { message: Some("couldnt_find_community"), inner: Failed to resolve actor for reddit@lemmy.ml, context: "SpanTrace" }
If I hit the same community from a working instance:
https://lemmy.world/c/[email protected]
I get a proper response.
I also see other errors/warnings, all of which indicate some problem with my settings for inbound traffic:
2023-06-13T01:54:08.818240Z WARN Error encountered while processing the incoming HTTP request: lemmy_server::root_span_builder: cant accept local object from remote instance
0: lemmy_apub::objects::comment::verify
at crates/apub/src/objects/comment.rs:135
1: lemmy_apub::activities::create_or_update::comment::verify
at crates/apub/src/activities/create_or_update/comment.rs:156
2: lemmy_apub::activities::community::announce::receive
at crates/apub/src/activities/community/announce.rs:149
3: lemmy_server::root_span_builder::HTTP request
with http.method=POST http.scheme="https" http.host=lemmy.timgilbert.be http.target=/inbox otel.kind="server" request_id=ddee706e-a793-409c-b6e7-eafe1fba646f http.status_code=400 otel.status_code="OK"
at src/root_span_builder.rs:16
LemmyError { message: None, inner: cant accept local object from remote instance, context: "SpanTrace" }
I suspect it has to do with having two nginx wrappers: the one set up by Lemmy’s docker-compose and the one in my NPM.
Anyone else have similar problems?
Does this happen with other instances? I think lemmy.ml is getting overloaded at the moment.
I suspect you’ve manually configured NPM to point directly to the lemmy-ui container. This will break stuff. The Ansible-provided nginx config routes different URIs to the lemmy and lemmy-ui containers.
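For reference, here is a rough sketch of that split. The upstream container names and ports (lemmy:8536, lemmy-ui:1234) are assumptions based on the default docker-compose setup; check the nginx.conf that ships with your install for the authoritative version.

    # Sketch only: route API/federation traffic to the lemmy backend,
    # everything else to lemmy-ui.
    upstream lemmy {
        server "lemmy:8536";       # assumed backend container name/port
    }
    upstream lemmy-ui {
        server "lemmy-ui:1234";    # assumed UI container name/port
    }

    server {
        listen 80;
        server_name lemmy.timgilbert.be;

        location / {
            # Default to the UI, but send ActivityPub requests and POSTs
            # (e.g. deliveries to /inbox) to the backend instead.
            set $proxpass "http://lemmy-ui";
            if ($http_accept ~ "application/(activity|ld)\+json") {
                set $proxpass "http://lemmy";
            }
            if ($request_method = POST) {
                set $proxpass "http://lemmy";
            }
            proxy_pass $proxpass;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }

        # API, pictrs, feeds, nodeinfo and well-known endpoints always go
        # to the backend.
        location ~ ^/(api|pictrs|feeds|nodeinfo|.well-known) {
            proxy_pass "http://lemmy";
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }

The key point is that ActivityPub/API traffic has to reach the lemmy backend rather than lemmy-ui, which is why pointing NPM only at the UI breaks federation.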
You might be on to something. I have a single NPM host pointing to the UI, but I do see the nginx config defining the two locations, and of course I’ve only mapped one. I’ll study this a bit more; thanks for the tip.
I had a similar problem, but I just used plain nginx as the proxy, following their installation guide.
What I found out is that the lemmy container needed to be exposed to an external network, but in the docker-compose file it was limited to an internal network only (named lemmyinternal). I ended up adding the lemmyexternalproxy network to the lemmy container as well, and that fixed my issue.
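For anyone hitting the same thing, the change is roughly this (the image tag and the omitted settings are placeholders; the network names are the ones used in the stock docker-compose.yml):

    services:
      lemmy:
        image: dessalines/lemmy:0.17.3   # assumed image/tag, keep whatever you use
        # ... rest of the service definition unchanged ...
        networks:
          - lemmyexternalproxy   # added: lets the reverse proxy reach the backend
          - lemmyinternal        # original internal-only network

    networks:
      lemmyexternalproxy:
      lemmyinternal:
        internal: true

After editing, recreate the container (docker-compose up -d) so it actually joins the new network.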
Try other instances, lemmy.ml is down right now. It’s been going up and down all day. And when it’s not down, federation is hours behind if not completely broken.
You can see it on lemmy.world because the two instances were already federated, so you’re seeing the cached copy.
Oh, I see. Thanks for the reply; I guess I still have a lot to learn.
You’re speaking to us here on lemmy.world, so I’m pretty sure you’ve got everything set up correctly!
Pretty neat! From my perspective it’s local. Kind of amazing.
I have a new Indiedroid Nova I am going to try it out on and see how it does. It’s an 8GB model with 32GB eMMC, and that’s all it will be running for now since it was an impulse buy.