In many countries, it is becoming mandatory to receive vaccines in order to prevent the spread of disease.
Do you think this is a positive or negative development?
There have been claims that Covid-19 vaccines have not been tested for their long-term effects, so people should not get vaccinated.
To what extent do you agree or disagree with this statement?
Many childhood diseases can now be prevented with vaccines. Should parents be required by law to immunize their children against common diseases, or should individuals have the right to choose not to immunize their children? Discuss both views and give your opinion.