Winter tires are for winter, not just snow; lots of people still have the old mindset that you only need proper tires when it's snowing.
Insurance companies are catching on. Most offer discounted rates when your vehicle has winter tires.
I think the discount is actually mandated by law in some places.