
You can’t just leave Threads in the Following feed


Meta’s Threads app has introduced a Following feed, offering users an alternative way to browse the platform. However, the Following feed doesn’t always stay visible: the app periodically defaults back to the algorithmic ‘For You’ feed.

Threads in the Following feed

To access the Following feed, users can tap the home icon at the bottom of the screen or the Threads logo at the top. This feed displays posts in reverse-chronological order, allowing users to keep up with the latest updates. Conversely, the ‘For You’ feed, arranged algorithmically, has been the app’s default offering since its launch.

The Verge noticed that Threads occasionally hides the Following feed and reverts users to the For You feed when they open the app. Meta has confirmed that the For You feed is designed to be the default experience. While users can switch to the Following feed, there is no option to set it as the default.

This decision has drawn criticism, especially from users seeking to keep up with the latest news regularly. However, Meta’s statement aligns with Instagram’s approach, where the Following feed is also deprioritized and hidden behind a nondescript arrow next to the Instagram logo.

Interestingly, this behavior echoes an experiment at Elon Musk’s Twitter, where the “For You” feed was made the default in January. That decision was quickly reversed after user backlash.

While Meta has not indicated any immediate plans to make the Following feed the default option permanently, users may continue to hope for a more user-friendly and customizable Threads experience in the future.


At its Meta Connect event today, CEO Mark Zuckerberg announced two new AI-powered features: generative AI stickers and AI editing tools.

Generative AI stickers

Generative AI stickers are a new way to express yourself in Meta’s messaging apps, including WhatsApp, Messenger, Instagram, and Facebook Stories. With generative AI stickers, you can create unique stickers by simply typing in a text prompt. For example, you could type “Hungarian sheep dog driving a 4×4” and Emu, Meta’s new foundational model for image generation, would generate a sticker that matches your prompt.

Generative AI stickers are currently in beta testing and will be available to English-language users over the next month.

AI editing tools

AI editing tools are a new way to edit your photos and videos using AI. With AI editing tools, you can change the style of your photos and videos, remove objects from the background, and even create new images and videos from scratch.

Meta demonstrated two new AI editing tools at Meta Connect: Restyle and Backdrop.

Restyle lets you reimagine the visual styles of an image by typing in prompts like “watercolor” or “collage from magazines and newspapers, torn edges.” Backdrop lets you change the scene or background of your image by using prompts.

AI editing tools will be available soon on Instagram.

Meta’s commitment to responsible AI

Meta has pledged to develop AI responsibly and ethically. The company says that it will indicate the use of AI in its images “to reduce the chances of people mistaking them for human-generated content.” Meta is also experimenting with forms of visible and invisible markers to help people identify AI-generated content.

Meta’s generative AI stickers and AI editing tools are exciting new features that could change the way we communicate and express ourselves, letting users create unique, personalized content tailored to their own needs and interests.

X, formerly known as Twitter, has expanded its crowdsourced fact-checking system, Community Notes, to support video content. This means that approved members can now attach written notes to videos that contain misleading information. The notes will be visible to all users, providing them with additional context about the content they are viewing.

The expansion of Community Notes to video content is a welcome move, as it could help to reduce the spread of misinformation on the platform. However, some experts have questioned the effectiveness of the system, due to a few fundamental flaws.

One flaw is that, for a Community Note to become visible, it must first be approved by members on both sides of the discourse. As a result, harmful or misleading content can circulate unchecked for some time before it is tagged with the proper disclaimer, if it ever is.

Another flaw is X’s disproportionate implementation of its moderation, safety, and security features. For example, X has been repeatedly called out for censoring critical voices targeting the government in markets like India and the Middle East. This raises concerns about whether Community Notes will be applied fairly and consistently across all types of content.

Finally, some experts have criticized the fact that X is indirectly passing the onus of fact-checking to its most prolific users, rather than having a dedicated trust and safety team do the job. This is especially concerning given that Elon Musk famously gutted the company’s safety team soon after he took over.

Overall, the expansion of Community Notes to video content is a positive step, but it is important to be aware of its limitations. The system is still under development, and it remains to be seen how effective it will be in reducing the spread of misinformation on the platform.

In addition to the flaws mentioned above, here are some other concerns about the Community Notes system:

  • It is unclear how X will ensure that Community Notes are accurate and unbiased.
  • The system could be used to harass or silence certain users.
  • It could be used to spread misinformation itself, if it is not carefully monitored.

Overall, the Community Notes system is a promising experiment, but it is important to be aware of its limitations and potential risks. X should carefully monitor the system and make adjustments as needed to ensure that it is effective and fair.
