Graphics-intensive platforms like Facebook may have strong support for closed captioning, since captions encourage engagement. However, not all social media platforms offer a separate field for alternative text. If you are not linking to another webpage or an accessible document, you have to enter the description directly into the post (or into each specific image). Example:
The Sun emitted a strong solar flare on May 14, 2024, peaking at 12:51 p.m. ET. NASA’s Solar Dynamics Observatory captured an image of the event, which was classified as X8.7. https://go.nasa.gov/4aipKPf
Solar flares are powerful bursts of radiation. Harmful radiation from a flare cannot pass through Earth’s atmosphere to physically affect humans on the ground. However — when intense enough — they can disturb the atmosphere in the layer where GPS & communications signals travel.
To see how such space weather may affect Earth, check out NOAA NWS Space Weather Prediction Center, the U.S. government’s official source for space weather forecasts, watches, warnings, and alerts. (NASA)
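For comparison, when you do control the markup, such as on your own webpage linked from the post, the description belongs in the image's alt attribute rather than in the surrounding text. The snippet below is a minimal sketch using the standard browser DOM API; the file name is a placeholder, and the description simply restates the post above.

```typescript
// Minimal sketch: attaching alternative text to an image on a page you control.
// The file name below is a placeholder, not a real asset.
const flareImage: HTMLImageElement = document.createElement("img");
flareImage.src = "solar-flare-2024-05-14.jpg"; // hypothetical file name
flareImage.alt =
  "NASA Solar Dynamics Observatory image of the X8.7 solar flare that peaked on May 14, 2024.";
document.body.appendChild(flareImage);

// A screen reader announces the alt text in place of the image. On social
// platforms that lack an alt field, the same description has to go into the
// post text instead, as in the example above.
```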
Facebook's developers have been working to improve the automated generation of alternative text for posted images to aid blind users. The feature draws on a database of known objects that is built up by users themselves, much like the face recognition database built when we tag our friends, simply by adding alternative text to the images they post.
Since adding alternative text is not as common a habit as tagging friends in photos, this database is still young and not yet a good substitute for descriptions we write ourselves, on any platform. For example, when a screen reader encounters an image on Facebook, it might say something like:
Image may contain: house, plant, tree, table and outdoor
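Conceptually, that announcement is just a list of detected object labels stitched into a sentence. The sketch below is purely illustrative, not Facebook's actual code, and the label list is a hypothetical output from an image recognizer.

```typescript
// Illustrative sketch only: turning detected object labels into an
// "Image may contain" style description. Not Facebook's implementation.
function describeImage(labels: string[]): string {
  if (labels.length === 0) {
    return "Image"; // nothing recognized, so no useful description
  }
  const last = labels[labels.length - 1];
  const rest = labels.slice(0, -1);
  const joined = rest.length > 0 ? `${rest.join(", ")} and ${last}` : last;
  return `Image may contain: ${joined}`;
}

// Hypothetical labels a recognizer might return for the photo above.
console.log(describeImage(["house", "plant", "tree", "table", "outdoor"]));
// -> "Image may contain: house, plant, tree, table and outdoor"
```

A generic label list like this tells a blind user far less than a sentence written by the person who posted the image, which is why manually written descriptions remain important.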
Facebook recently added a way to enter your own alternative text for an image you upload, in the desktop browser version of the site.