What do Americans think of US foreign policy?
As a group, I think Americans are far too diverse in their views and opinions to identify any overarching position they hold on foreign policy. If anything, I would say that Americans are generally dissatisfied, but for different reasons. For example, some people think we should be more engaged, while others think we should be less engaged. Some think we should act unilaterally, while others think we should work multilaterally. Some favor a more robust military posture, some don't; the list goes on and on.
America is also somewhat unusual in its foreign policy in that the President - who changes every four or eight years - has wide latitude to direct and change it. This makes any discussion of a cohesive, long-term "American" foreign policy a little difficult.
I think war is not ethically justifiable unless there is no other choice. I feel it is profitable, however, and that is why it seems popular with our politicians. I feel our country did not learn its lesson in Vietnam, and we should adopt a more isolationist foreign policy instead of trying to police the world. Who are we to be policemen enforcing international law? War is nasty; war is hell. So if we are absolutely forced to fight, we should do it more like the Russians and win decisively and swiftly, by any means necessary.