
Is Technology Development Shaped by Men’s Needs?

Is men’s research prioritised in the development of new technologies? – London Business School

When Apple first launched its health‑tracking app, it could monitor everything from steps to sodium intake – but not menstruation. The oversight sparked outrage and a broader question that refuses to go away: whose bodies, lives and needs are embedded in the technologies that shape our world?

From crash test dummies based on the “average” male body to voice-recognition systems that struggle with higher-pitched voices, critics argue that innovation has long been designed around men by default. As digital tools, AI systems and medical devices increasingly mediate everything from our finances to our fertility, the stakes of that bias grow sharper. At London Business School, researchers are probing a fundamental issue at the heart of this debate: is men’s research still being prioritised in the development of new technologies – and what does that mean for performance, safety and inclusion? This article examines the evidence, the economic incentives and the hidden assumptions that continue to shape the tech we all rely on.

Unequal data: the gender gap embedded in tech innovation

From crash test dummies modelled on the ‘average’ male body to voice-recognition systems that falter with higher-pitched tones, the data that powers modern technology often skews male by design. This isn’t always the result of explicit bias; more often it stems from who is counted, who is studied and who is considered the “default user” in research trials and product testing. When women, non-binary people and gender-diverse users are underrepresented in datasets, technologies built on those datasets can systematically underperform for them. The result is an innovation pipeline where risk, reliability and even safety are optimised around a narrow slice of the population.

  • Clinical wearables trained on male physiology misread symptoms in women
  • AI recruitment tools amplify biases buried in historic, male-dominated CVs
  • Finance and credit algorithms penalise career breaks and part-time work patterns
  • Mobility and transport apps overlook safety needs and caregiving travel routes
Tech Area | Hidden Gender Risk | What Gets Missed
Health AI | Male-centric training data | Female symptom patterns
Workplace tools | ‘Always-on’ user norm | Flexible and unpaid care roles
Smart cities | Commute-focused design | Multi-stop, caregiving journeys

As London’s fintech firms, AI start-ups and global tech players race to build the next generation of digital infrastructure, the real competitive edge may lie in who their data represents. Organisations that interrogate their datasets for demographic blind spots, invest in gender-disaggregated metrics and bring more women into research design teams are not indulging in optics; they are improving product accuracy and market fit. In a landscape where regulation around algorithmic fairness is tightening, failing to correct gendered data gaps is no longer just a diversity issue – it is a strategic and commercial liability.
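The dataset interrogation described above can be sketched in a few lines. The following is a minimal, hypothetical illustration – the field name `"sex"`, the record layout and the thresholds are assumptions for the example, not a prescribed standard:

```python
from collections import Counter

def representation_gaps(records, group_key, thresholds):
    """Report each group's share of a dataset against a minimum
    acceptable share, returning only the under-represented groups."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, minimum in thresholds.items():
        share = counts.get(group, 0) / total if total else 0.0
        if share < minimum:
            gaps[group] = {"share": round(share, 3), "minimum": minimum}
    return gaps

# Hypothetical training set that skews male (80 / 20 split).
data = [{"sex": "male"}] * 80 + [{"sex": "female"}] * 20
print(representation_gaps(data, "sex", {"male": 0.4, "female": 0.4}))
# → {'female': {'share': 0.2, 'minimum': 0.4}}
```

A check like this, run routinely over training data, turns the “demographic blind spot” question from a one-off review into a repeatable metric.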

Behind the lab door: how funding and research design favour men

In the quiet bureaucracy of grant applications and ethics approvals, gender bias often hides in plain sight. Funding calls are frequently framed around conditions, performance metrics and risk factors that skew male: cardiovascular disease over autoimmune disorders, elite sport over unpaid care work, crash-test impacts over chronic pain from repetitive tasks. Because evidence of market size is a key criterion, projects that centre on women – whose symptoms are historically under-documented and under-diagnosed – appear less “data rich” and thus less commercially attractive. The result is a feedback loop in which researchers design studies around male bodies to meet funding expectations, and funders then point to those studies as proof of where the “real” demand lies.

  • Eligibility rules that privilege established (often male-led) labs
  • Clinical trial norms that default to male participants as the “standard” human
  • Outcome measures calibrated to male physiology and work patterns
  • Risk models that downplay pregnancy, menopause and caregiving realities
Design Choice | Favours | Overlooks
Average male body as “baseline” | Drug dosing, safety thresholds for men | Different reactions in women
9-5 trial schedules | Participants with stable office hours | Shift workers, carers, part-time staff
Cardio and strength endpoints | Peak performance and speed | Fatigue, pain, hormonal effects

Once baked into protocols, these choices shape everything from who is recruited to what counts as a “successful” outcome. Study inclusion criteria may quietly filter out those who are pregnant, peri-menopausal or juggling fragmented working hours – categories that disproportionately include women – because they introduce “too much variability” into results. Devices are then calibrated on this narrowed sample, locking male-biased assumptions into the hardware and software of emerging technologies. The paradox is stark: in the name of scientific control and commercial efficiency, research design systematically smooths away the complexity of women’s lives and bodies, and then declares the resulting technologies universally applicable.

From boardrooms to beta tests: who decides which bodies technology serves

In the rush to innovate, the quiet power lies with those who set the agenda and those who test the prototypes. Investment committees, product leads and policy teams often operate with a default male user in mind – not out of malice, but habit. When datasets are trained on predominantly male bodies, when lab trials exclude pregnant people, and when venture capital panels skew male, the “average user” becomes a narrow archetype. This is where design decisions about everything from crash-test dummies to wearables’ heart-rate algorithms are made, shaping whose safety, comfort and health are optimised. The result is a pipeline in which men’s needs are treated as baseline requirements, while women’s are framed as niche, optional or “later-phase” improvements.

By the time a product reaches beta testing, the bias is already embedded, but it often deepens when feedback loops lack diversity. Closed test groups of young, urban male early adopters can mask failure points that disproportionately affect women, older users or people with disabilities. To shift this, organisations are beginning to formalise inclusion at every decision node:

  • Funding briefs that explicitly require gender-balanced user impact.
  • Research protocols that mandate sex-disaggregated data and analysis.
  • Beta cohorts that mirror real-world diversity, not just tech-savvy volunteers.
  • Governance boards with clear accountability for equity in product outcomes.
Decision Stage | Typical Bias Risk | Equity Check
Funding & scoping | Male default use-cases | Gender impact in business case
Research & data | Male-skewed samples | Sex-disaggregated metrics
Prototyping | One-size-fits-men design | Body-size and hormone-aware specs
Testing & launch | Homogeneous testers | Diverse beta panels and audits

Resetting the pipeline: practical steps for gender-inclusive technology development

Rebalancing innovation starts long before a product reaches the lab, beginning with who is invited to define the problem. Organisations can embed gender-aware governance by requiring diverse review panels at every stage of funding, design and testing, and by mandating that research proposals specify how sex and gender differences will be addressed or why they are not relevant. Product teams can shift away from the default male user persona by drawing on intersectional data, participatory workshops and ethnographic research that deliberately includes women and non-binary people across age, income and geography. Procurement guidelines, too, can be rewritten so that external vendors must evidence inclusive datasets, mixed-gender leadership and clear plans to mitigate algorithmic bias.

  • Audit existing projects for gender blind spots and data gaps.
  • Redesign user-testing protocols to ensure equitable representation.
  • Reward teams that demonstrate measurable inclusion outcomes.
  • Report on gender impact alongside financial and technical KPIs.
Pipeline Stage | Inclusive Action | Quick Metric
Idea generation | Co-create problem statements with diverse user groups | % women/non-binary in workshops
Data collection | Set minimum thresholds for sex-disaggregated datasets | Share of records tagged by sex/gender
Prototype testing | Run trials in varied real-world environments | Test cohorts by gender and context
Launch & scale | Publish gender impact notes with product releases | Inclusion KPIs in launch reports
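The “share of records tagged by sex/gender” metric at the data-collection stage can double as a hard gate rather than a report. A minimal sketch, assuming records are dicts and a hypothetical `"sex"` field carries the tag; the 90% threshold is an invented example, not a recommended figure:

```python
def tagged_share(records, field="sex"):
    """Share of records carrying a non-empty sex/gender tag."""
    if not records:
        return 0.0
    return sum(1 for r in records if r.get(field)) / len(records)

def passes_collection_gate(records, field="sex", minimum=0.9):
    """Refuse to pass a dataset downstream when too few records are
    tagged, so the gap is caught at collection, not after launch."""
    return tagged_share(records, field) >= minimum

# Hypothetical batch: 20 of 100 records arrive untagged.
batch = [{"sex": "female"}] * 50 + [{"sex": "male"}] * 30 + [{}] * 20
print(tagged_share(batch))            # → 0.8
print(passes_collection_gate(batch))  # → False
```

Wiring such a gate into a data pipeline makes the minimum threshold enforceable rather than aspirational, which is the difference the surrounding paragraphs argue for.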

Embedding these measures into performance reviews and investment criteria moves gender inclusion from a “nice to have” to a non-negotiable standard. Leadership can link bonuses and promotion pathways to inclusive design milestones, while accelerators and corporate venture arms can prioritise founders who build with diverse teams and user panels from day one. Over time, this rewired pipeline changes whose pain points are seen as commercially valuable, ensuring that new technologies are not merely technically advanced but also socially attuned to the full spectrum of human experience.

In Summary

As the pace of innovation accelerates, the question is no longer whether gender bias exists in technology, but how far we are prepared to go to dismantle it. The history of medical trials dominated by male subjects, crash tests designed around the “average” male body, and AI tools trained on skewed datasets is not just a record of oversight – it is a reminder that who gets counted shapes who gets protected, served and empowered.

What emerges from London Business School’s examination is not a simple indictment of “men’s research” but a call to reframe what counts as mainstream. Inclusive design, representative data and diverse research agendas are not add-ons; they are prerequisites for technologies that work fairly and effectively for everyone. As regulators, investors and consumers become more attuned to these gaps, the competitive advantage will lie with organisations that recognise bias as a design flaw, not an inevitability. Whether men’s needs continue to be hardwired into the foundations of new technologies, or whether we build systems that reflect the full spectrum of human experience, will depend on decisions being made in labs, boardrooms and classrooms today.
