Given the recent influx of Christian movies coming out of Hollywood, many are asking whether these Bible films and TV shows could help Christianity not only make a comeback and regain its lost prominence in American culture, but also carry the gospel message to other cultures around the globe. Recently, it’s seemed as though the …