Suicide bombers and military draftees aside, are there any cultures that teach women to be just as violent as men? Meaning they're not brought up to be what the West might call "ladylike", but are physically and emotionally willing to injure other human beings for self-preservation, to settle altercations, or in any situation where men in that society would be expected to be violent?
For example, a culture that asks women who are mugged “Why didn’t you shoot the guy?”, or when they get bullied, “Did you kick her ass?”
For the purposes of this thread, I'd like to differentiate between aggression and violence, and focus specifically on societies that indoctrinate/raise their women to be physically tough and willing to use that toughness.