The transfer always goes to the same number, so I don't have to worry about multiple destinations. There are different reasons for the transfer, but when I repeatedly test the same scenario it always fails; if I immediately test a different scenario it succeeds, and when I switch back it fails again. It is always the same kinds of scenarios that fail.
Then I changed the LLM and it worked much better. I switched from OpenAI GPT-4o mini to Llama 3.1 70B on Groq, and the transfers became far more reliable, though sometimes it was a bit too eager to transfer. I will test other models, very slight variations in the prompt, and many different scenarios. Once I thoroughly understand what is going on, or think I do, I will post the solution here.
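For anyone who wants to try the same swap, here is a minimal sketch of the change I made. I'm assuming a Vapi-style assistant config where the model is chosen via a provider/model pair; the exact field names and model IDs are assumptions on my part and may differ on your platform, so treat this as illustrative rather than exact.

```python
# Hypothetical assistant config showing the model swap.
# Field names follow an assumed "provider"/"model" pair; adjust to
# whatever your voice platform actually expects.

# Before: transfers in the problem scenario failed consistently.
assistant_before = {
    "model": {
        "provider": "openai",
        "model": "gpt-4o-mini",
    },
}

# After: same prompt and scenarios, transfers became far more reliable
# (occasionally a bit too eager to transfer).
assistant_after = {
    "model": {
        "provider": "groq",
        "model": "llama-3.1-70b-versatile",  # Llama 3.1 70B served by Groq
    },
}
```

Everything else (prompt, transfer number, scenarios) stayed the same between the two runs, so the model was the only variable I changed.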