"Transformer models like BERT and GPT-2 are domain agnostic, meaning that they can be directly applied to 1-D sequences of any form. When we train GPT-2 on images unrolled into long sequences of pixels, which we call iGPT, we find that the model appears to understand 2-D image characteristics such as object appearance and category."
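The "unrolling" step the quote describes can be sketched in a few lines. This is a minimal illustration, not OpenAI's actual iGPT pipeline (which also quantizes pixel colors into a reduced palette before modeling); the tiny 2x3 image and the raster-scan order are assumptions for the example.

```python
# Hypothetical 2x3 grayscale image (values 0-255), for illustration only.
image = [
    [10, 20, 30],
    [40, 50, 60],
]

# Raster-scan (row-major) unroll: left-to-right, then top-to-bottom.
# The result is a plain 1-D sequence, the same shape of input GPT-2
# consumes for text, so the transformer needs no architectural changes.
sequence = [pixel for row in image for pixel in row]
print(sequence)  # [10, 20, 30, 40, 50, 60]

# Autoregressive training then factorizes the sequence probability as
# P(x) = prod_t P(x_t | x_<t): each pixel is predicted from the ones
# that came before it in the unrolled order.
for t in range(1, len(sequence)):
    context, target = sequence[:t], sequence[t]
```

Because the model only ever sees this flat sequence, any 2-D structure it appears to capture (object shape, category) has to be learned from the statistics of the pixels alone, which is what makes the iGPT result notable.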
I don't see why big banks aren't already using similar AI to analyse patterns in the stock market. To me, an active investor trying to play the market sounds like a chess grandmaster trying to beat a chess engine.