What do you think nursing is? What has most influenced how you see nursing? How do you react to shows that feature nurses? How do you think these portrayals can affect the general public's view of nursing? Have you had family or friends question why you would want to become a nurse? Has someone told you to become a doctor instead? What can we as nurses and nursing students do to positively or negatively affect the image of nursing? This is more of a “personal” discussion, and I do not expect you to bring in resources unless you want to. If you have not already done so, please review the rubric for discussion posts: they need to be “substantial,” meaning your answers should have depth and carry the conversation forward.