Nursing is a profession focused on the care of individuals, families, and communities to promote, maintain, or restore health and well-being. Nurses work in settings such as hospitals, clinics, schools, and community health centers, providing direct patient care, advocating for patients' rights, educating patients and the public about health issues, and collaborating with other healthcare professionals to ensure comprehensive care.