Executives at the social media giant found themselves the target of a complaint filed in 2016 that alleged they were not taking sufficient action to clamp down on racist content.
Concern in Germany over vitriolic comments by some Facebook and Twitter users has been rising, and it intensified as public misgivings grew in some quarters over the arrival of more than a million asylum seekers since 2015.
Lawyer Chan-jo Jun, who initiated the complaint, had compiled 442 Facebook posts inciting hatred and violence or expressing support for terrorist groups.
But he said the social network failed to delete them, even though they were repeatedly flagged as offensive speech.
Despite the list of posts, the prosecutor said: “Failing to delete illegal posts on the internet platform in a timely way is not a basis for suspicion of criminal behaviour by executives at Facebook.”
Germany has moved in recent months to clamp down on online hate speech.
A new law that came into force on 1 January requires social media giants to remove hate speech and other illegal content or risk fines of up to €50 million.
Under the legislation, companies such as Twitter and Facebook have 24 hours to remove posts that openly violate German law once they are flagged by users.
Fierce protests by free-speech advocates against the law — known as NetzDG — prompted the government to announce in early January it was already considering modifications.