Formal math notation of masked vector
I'm struggling to write my algorithm concisely and correctly. The following describes an optimizer's update step for part of a vector of weights (a vector, not a matrix, in my case).
I have a vector $\alpha \in \mathbb{R}^d$ and a set $S$ of indices $1\leq i \leq d$ (i.e., $S \subseteq \{1,\dots, d\}$). I want to denote the vector that is $0$ at every index $i\in S$ and keeps the value $\alpha_i$ at every other index. At first I denoted it $\alpha_S$, but I'm not sure that is properly defined or understandable.
I could use the following notation:
$$(\alpha_S)_j = \begin{cases} 0 & j \in S\\ \alpha_j & j \notin S \end{cases}$$
But it takes up twice the line height, which I want to avoid.
Is there another formal, compact way to denote this correctly? Maybe some kind of masking vector to be multiplied with $\alpha$?
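For concreteness, the masking-vector idea I have in mind looks like this in NumPy (the names, dimension, and values here are my own illustration, using 0-based indices): build the indicator vector $m$ of $S$, then take $\alpha_S = (\mathbf{1} - m) \odot \alpha$, where $\odot$ is the elementwise product.

```python
import numpy as np

d = 6
alpha = np.array([0.5, -1.2, 3.0, 0.7, 2.2, -0.4])
S = {1, 3}  # indices to zero out (0-based in code)

# Indicator (mask) vector of S: m[i] = 1 if i in S, else 0
m = np.zeros(d)
m[list(S)] = 1.0

# alpha_S = (1 - m) ⊙ alpha: zeros the entries indexed by S,
# keeps all other entries of alpha unchanged
alpha_S = (1 - m) * alpha
print(alpha_S)  # → [ 0.5  0.   3.   0.   2.2 -0.4]
```

Written this way, the whole update stays on one line: $\alpha_S = (\mathbf{1} - \mathbf{1}_S) \odot \alpha$, with $\mathbf{1}_S$ the indicator vector of $S$.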
Thanks!
Tags: mathematics, notation, optimization
Category: Data Science