A group of people who generally believe in the same tenets of life, as laid out in a self-proclaimed sacred text, is called a religion. Religions have been the root cause of mankind's downfall...since the times of Moses.
Now, on the other hand, there is something called Faith. Faith means having complete belief and trust in something. It also means that a person strives to live by the ideals of their belief, confident that they will reap some manner of benefit upon their passing.
These terms often get swapped around inappropriately. Religions are usually the main culprit, because corporate religion has everything to gain by controlling what Faith is. But you know what, almost across the board, the 'religious' texts teach the same basic things when it comes to how We as people are supposed to live...
Treat each other with decency and respect.
Leave punishment for 'God' to mete out.
Take care of the Earth that 'God' gave Us.
Just about everything after that becomes religion. Religion has been used as a tool for both War and Peace, but mainly War. Labeling, Exclusion, Physical harm and Death are all treated as acceptable ways of dealing with the transgressions of people in Judaism, Christianity and Islam. These ideals have even been used to script and codify the laws of Nations and the Declarations of War that have led to where Human history now stands.
What has been the result of these 2,000 years or so of organized corporate religion? It doesn't take long to figure out that Man has made a mess of Itself. Despite all the wonders of the World, Intelligent Minds and the Guidance of 'God', We have wrought Moral Decay, Constant Wars, Pandemic Health Issues and Mass Pollution across the entire Creation.
What can We do to fix what We have done to Ourselves?