Dear colleagues,
We are delighted to announce SemEval-2026 Task 3 Track B: Dimensional Stance Analysis
*Aspect-Based Sentiment Analysis (ABSA)* is a widely used technique for analyzing people’s opinions and sentiments at the aspect level. However, current ABSA research predominantly adopts a coarse-grained, categorical sentiment representation (e.g., positive, negative, or neutral). This approach stands in contrast to long-established theories in psychology and affective science, where sentiment is represented along fine-grained, real-valued dimensions of valence (ranging from negative to positive) and arousal (from sluggish to excited). This valence-arousal (VA) representation has inspired the rise of dimensional sentiment analysis as an emerging research paradigm, enabling more nuanced distinctions in emotional expression and supporting a broader range of applications.
Given an utterance or post and a target entity, stance detection involves determining whether the speaker is in favor of or against the target. *This track reformulates stance detection as a Stance-as-DimABSA task with the following transformations:*
*1. The stance target is treated as an aspect.*
*2. Discrete stance labels are replaced with continuous VA scores.*
Building on this, we introduce *Dimensional Stance Analysis (DimStance)*, a Stance-as-DimABSA task that reformulates stance detection under the ABSA schema in the VA space. This new formulation extends ABSA beyond consumer reviews to public-issue discourse (e.g., politics and environmental protection) and also generalizes stance analysis from categorical labels to continuous VA scores. Given a text and one or more aspects (targets), systems must predict a real-valued valence-arousal (VA) score for each aspect, reflecting the stance expressed by the speaker toward it.
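To make the input/output shape concrete, here is a minimal, purely illustrative sketch of a trivial baseline. The function name, the dictionary format, and the assumed 1-9 VA scale are our own assumptions for illustration; the authoritative data format is specified on the task website.

```python
def predict_stance_va(text: str, aspects: list[str]) -> dict[str, tuple[float, float]]:
    """Trivial baseline: return a neutral (valence, arousal) pair for every aspect.

    Assumes a 1-9 VA scale with 5.0 as the neutral midpoint (an assumption,
    not the official task specification).
    """
    NEUTRAL_VA = (5.0, 5.0)
    return {aspect: NEUTRAL_VA for aspect in aspects}

preds = predict_stance_va("Wind farms are vital for our future.", ["wind farms"])
print(preds)  # {'wind farms': (5.0, 5.0)}
```

A real system would replace the constant with a model that scores each (text, aspect) pair.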
——————— *Languages* ——————— *We provide data in 5 languages*: German (deu), English (eng), Hausa (hau), Swahili (swa), and Chinese (zho).
——————— *Evaluation* ——————— Systems are evaluated using root mean squared error (RMSE) between the predicted and gold VA scores.
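For reference, RMSE over a flat list of predicted and gold scores can be computed as below. This is a generic sketch of the metric, not the official scoring script; see the task website for the evaluator actually used.

```python
import math

def rmse(preds: list[float], golds: list[float]) -> float:
    """Root mean squared error between predicted and gold real-valued scores."""
    assert len(preds) == len(golds) and preds, "need equal-length, non-empty lists"
    return math.sqrt(sum((p - g) ** 2 for p, g in zip(preds, golds)) / len(preds))

# Hypothetical predicted vs. gold valence scores for three aspects.
print(round(rmse([6.5, 3.0, 5.2], [7.0, 2.5, 5.0]), 4))  # 0.4243
```

Lower RMSE is better; a perfect system scores 0.0.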
——————— *Participation* ——————— *Website* (check out the details): https://github.com/DimABSA/DimABSA2026
*Codabench* (register and submit results) - Track B: https://www.codabench.org/competitions/11139/
*Discord* (community and discussion): https://discord.gg/xWXDWtkMzu
*Google Group* (official updates): https://groups.google.com/g/dimabsa-participants
——————— *Important Dates* ———————
- Sample Data Ready: 15 July 2025
- Training Data Ready: 30 September 2025
- Evaluation Start: 12 January 2026
- Evaluation End: 30 January 2026
- System Description Paper Due: February 2026
- Notification to Authors: March 2026
- Camera Ready Due: April 2026
- SemEval Workshop 2026: co-located with ACL 2026 (San Diego, CA, USA)
We warmly invite the community to participate in this exciting shared task and contribute to advancing NLP research.
Best regards, SemEval-2026 Task 3 Organizers