Data-driven deaths: How Israel's AI war machine pinpoints Palestinian victims



Ali lived in what had been a relatively untouched neighbourhood in eastern Gaza city until the night of October 12, 2024, when an Israeli bomb struck out of nowhere. Shaken but unhurt, the IT technician fled with his laptop to the Sheikh Radwan neighbourhood in the north of the city to live with his aunt.

Two weeks later, shortly before midnight, the Palestinian was working on his laptop on the rooftop, in search of a stronger signal to upload files via a VPN, when he heard a drone circling overhead.

“It was closer than usual. Then seconds later I saw a red light coming down on to the rooftop right in front of me, no more than 20 metres away,” he says.

The blast threw him off his chair but he was largely unscathed. As he ran back down the stairs his aunt’s family shouted: “The strike was for you! What were you doing? Who were you communicating with? Who do you have connections with?” Ali’s uncle told him to pack his bags and leave.

He rushed to the house of a friend, an IT expert, who suggested that Ali’s activities had been analysed by artificial intelligence and that he had been flagged on a suspicion list “for my ‘unusual behaviour’ of working with international companies, using encryption programmes and spending long hours online”.

Amid the devastation of Gaza, Ali and many others believe there is an unseen, pervasive AI presence that is watching, listening and waiting for those on its target list to show their faces.

Israeli special combat soldiers conduct a training exercise using virtual reality battlefield technology in 2017. Getty Images

To survive, Ali now accesses the internet under strict security measures and in very short bursts. “Their AI systems see me as a potential threat and a target,” he says, sharing the fears of many trapped Palestinians that a machine is now determining their fate.

In another such tale, less than three minutes after two young men entered the first floor of an apartment block, a bomb struck, killing not only the pair but also Mohsen Obeid’s mother, father and three sisters.

Mr Obeid, 34, was devastated and baffled. His family had no links to Hamas or any other faction. “We were innocent civilians,” he later told The National.

It was only after the attack in May last year that a consoling neighbour told him that he had seen the two young men, “presumably from the resistance”, entering the house.

The Obeid family were in their second-floor flat in Al Faluja, north of Gaza city, completely unaware that Israel's state-of-the-art AI system had almost certainly used its immense data-harvesting tools to give the men a high “suspicion score”.

In an investigation into these Gaza deaths, The National found:

  • Israel operates a 20-second decision review known as a TCT (time-constrained target) once a potential victim is picked up by the AI. These strikes are conducted on known Hamas operatives but also involve civilians
  • Israel’s state-of-the-art AI system uses data harvesting to give Gazans a high “suspicion score” that sets up a battlefield hit
  • Israel operates a series of AI systems that are routinely run with confidence levels as low as 80 per cent when confirming a legitimate target
  • Its known AI systems are Lavender, which raids data banks to generate potential confirmation of the target as a percentage; Gospel, which identifies static military targets such as tunnels; and Where’s Daddy?, which computes when a person is in a certain place
  • The target acquisition relies on facial recognition and other tools, including mapping a person’s gait and cross-checking identities
  • The target set includes as many as 37,000 Palestinians – complete with their photographs, videos, phone numbers and social media data – profiled in the systems. A simplified sketch of how these reported pieces might fit together follows this list.
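
None of these systems’ internals are public, so any reconstruction is necessarily speculative. Purely as an illustration of how the reported elements could chain together, here is a minimal sketch in Python; every name, field and threshold below is an assumption drawn from the findings above, not a disclosed detail:

    # Hypothetical sketch only: none of these systems' internals are public.
    # Names, fields and thresholds are assumptions based on reported findings.
    from dataclasses import dataclass

    @dataclass
    class Profile:
        suspicion_score: float     # Lavender-style score mined from harvested data
        match_confidence: float    # e.g. facial and gait recognition, in per cent
        at_tracked_location: bool  # a Where's Daddy?-style presence trigger

    CONFIDENCE_FLOOR = 80.0        # reported as running as low as 80 per cent

    def flag_for_human_review(p: Profile) -> bool:
        """Would this profile be passed up the chain for a strike decision?"""
        return (p.suspicion_score >= 90.0            # assumed cut-off
                and p.match_confidence >= CONFIDENCE_FLOOR
                and p.at_tracked_location)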

Hoovering data

That, as much as anything else, sealed their fate. The system code-named Lavender had mined data collected over many years, said Chris Daniels, an AI specialist at Flare Bright, a machine-learning company.

“Gaza is a hugely surveilled place and Israel has taken that imagery, for however many years recording it all and feeding it into the system. They are just hoovering up all that data, visual facial recognition, phones, messages, internet, social media and because you've got two million people that is a huge amount for a machine-learning programme.”

Tal Hagin, an open-source intelligence analyst and software specialist based in Israel, believes there is an inflection point coming, if not already reached, where AI will make most battlefield decisions.

An Israeli drone flies over Rafah in the southern Gaza Strip. AFP

“The question is, are we at a point already where AI is taking over command, making decisions on the battlefield or are we still in the era of AI simply being an assistant, which is a huge difference.”

Certainly, in the early stages of the current Gaza conflict, the Israeli military was “eliminating different targets at a very, very increased speed” that would have required machine-generated information.

David Kovar, a cyber-AI specialist who has contracts with the US Department of Defence, said Israel had developed an enormous amount of its targeting information with AI “but they really weren't putting humans in the loop to validate whether these targets were legitimate”.

Human input

The strikes on Ali and the Obeid family, two of the many thousands that have taken place in Gaza over the past 21 months, raise serious concerns over the machine-driven mission Israel is carrying out. Does AI select the right people? Do humans have enough input? Are there any controls?

The National’s investigation found not only questions over the accuracy of Lavender’s decision-making but also many civilians, like Mr Obeid’s family, who have apparently been killed by the system, and signs that a high collateral death toll from AI-generated information is accepted.

Israeli forces are using high-tech weapons in Gaza. Getty Images

The Israeli army, Mr Obeid said, “killed my family based on information generated by artificial intelligence” and without verifying if others were present in the building.

At the other end of the attack are the Israeli military's AI "target officers", who are apparently content to go with an 80 per cent probability to confirm a target for a strike, despite the collateral consequences, said Mr Daniels, also a former British army officer with strong Israeli military and intelligence connections.

“When it's 83.5 per cent and the human in the loop goes ‘yes’, that’s a good enough number. For a low-value target, the number might be 95 per cent, but for a high-value target it could be as low as 60 per cent.”

The “tolerance for errors” within the Israeli command room, he has been told, was “immensely high”. “There's an element of dangerous errors pretty quickly if you remove the human,” he added.
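
Taken at face value, Mr Daniels’ figures imply a sliding scale in which the confidence bar drops as the target’s value rises. A minimal sketch, assuming only his quoted numbers:

    # Illustrative decision rule built solely from Mr Daniels' quoted figures.
    # The real criteria are not public.
    REQUIRED_CONFIDENCE = {
        "low_value": 95.0,   # "the number might be 95 per cent"
        "standard": 80.0,    # the commonly cited floor
        "high_value": 60.0,  # "could be as low as 60 per cent"
    }

    def operator_accepts(confidence: float, target_value: str) -> bool:
        """The 'yes' Mr Daniels describes, reduced to a threshold check."""
        return confidence >= REQUIRED_CONFIDENCE[target_value]

    print(operator_accepts(83.5, "standard"))   # True: clears the 80 per cent bar
    print(operator_accepts(83.5, "low_value"))  # False: short of 95 per cent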

The National spoke to Olivia Flasch, a lawyer who has advised the UN on the laws of armed conflict. She said: “It's prohibited to launch an attack that's expected to cause injury to civilians, that's excessive in relation to the concrete military advantage that is anticipated."

If a commander was 80 per cent sure that the target was the “mastermind of a terrorist organisation” and with him dead the war was likely to end, “that's a high military advantage”, she said. The assessment did not apply to rank-and-file fighters.

Sadly, for Mr Obeid and many others in Gaza, it appears the system named after the garden herb is now more redolent of death.

Killer systems

Lavender has been fed a vast mine of data, including images taken from covert surveillance, open-source intelligence and justice-system records of Palestinians whom Israeli intelligence determined belonged to Hamas or other groups in Gaza.

The Lavender system does not necessarily generate targets itself; rather, it processes information that is then displayed for an intelligence officer. It is understood this then travels up a chain of command to a higher-ranking intelligence officer, who will take into account civilian casualties when authorising a strike mission.

The National is also aware that there are other AI systems used by Israel whose codenames have not yet been disclosed – it is unclear what their capabilities are, such as in terms of precision targeting.

Insiders worry that Lavender has spawned a form of warfare where the human touch is largely absent at vital points.

An Israeli drone pilot beside a Hermes 900 unmanned aerial vehicle at Palmachim Airbase. Getty Images

Mr Kovar's information suggested that if a person was spotted above ground, moving between buildings, and AI had 80 per cent confidence this was a legitimate target, “they're going to take that shot”, he said, despite the risk of “collateral damage”.

The National has spoken to security sources and experts and viewed open-source intelligence, piecing together how the system works from acquiring a target to their “elimination”.

When a person with a high “suspicion score” has their face recognised and location identified by AI, machine-driven analysis goes to work. This will include studying the person’s gait, their location and, using Gospel, their expected destination, alongside a wealth of other data processed within seconds.

Mr Kovar said considerable effort had been "put into human facial and gait recognition, how people walk and move, and where they’re going”.

The system also significantly speeds up the ability to make observe, orientate, decide and act (Ooda) loop decisions, allowing for a rapid military response.

“If AI can get you through that Ooda loop, from an image to identifying who the human is, then saying, ‘OK, we're going to take the shot faster than a human can do it’ particularly [if] a human has to go to check with higher-ups, then they're going to use the AI," he said.
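
Conceptually, the loop Mr Kovar describes can be reduced to four machine-compressed stages. The sketch below is a generic illustration of an Ooda loop, not a description of any real Israeli system:

    # Generic Ooda-loop skeleton: each stage a function, with AI replacing the
    # human steps that previously slowed the cycle. Entirely conceptual.
    from typing import Callable, Optional

    def ooda_cycle(observe: Callable, orientate: Callable,
                   decide: Callable, act: Callable, frame) -> None:
        sighting = observe(frame)             # e.g. drone imagery
        identity = orientate(sighting)        # e.g. face and gait matching
        cleared: Optional[object] = decide(identity)  # confidence vs threshold
        if cleared is not None:
            act(cleared)                      # the step that once waited on higher-ups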

The assembled Lavender information goes to an operator, giving the potential confirmation of the target as a percentage (for example, 83.5 per cent) alongside their suspicion score, suggesting how senior a figure might be.

The AI system also significantly increases the speed of response in hitting a target, without having to wait for authorisation from senior officers.

Drone footage of Hamas leader Yahya Sinwar moments before he was killed. AFP

The Lavender system was first disclosed by the Israeli outlet +972 in April last year, with Israeli sources claiming that operators were permitted to kill 15 or 20 civilians to eliminate even low-ranking Hamas members.

The Gospel AI system first appeared on Israeli armed forces' websites in 2021, described as an algorithm-based tool that identifies static military targets, such as tunnels or fighters’ homes, and can assist a rapid response if a suspect enters them. This has since been aligned with another system called Where’s Daddy?, which can compute when a person is in a certain place.

“Where’s Daddy? is used to track individuals that have been targeted by Lavender and it strikes individuals once they've entered their homes,” said AI specialist Nilza Amaral, of the Chatham House think tank. This might explain why most of Israel's attacks on AI-identified targets take place on buildings.

Key champion

Outlandish as that might seem, Brig Yossi Sariel, head of Israel’s Unit 8200, the specialist team that introduced Lavender, wrote a book called The Human-Machine Team: How to Create Synergy Between Human & Artificial Intelligence That Will Revolutionise Our World.

In it, he describes a “target machine” which processes people’s connections and movements via social media, mobile phone tracking and household addresses.

Brig Sariel is allegedly a key driver behind the use of AI and, while the system’s precise workings remain highly secretive, it is understood Lavender generates a numerical “suspicion score” that, if high enough, will lead to a target for elimination.

Generating that high suspicion score makes death in Gaza a near inevitability, whether the person is a member of Hamas, Palestinian Islamic Jihad or neither.

The strikes on the fighters, particularly in the early months of the campaign, were incessant, contributing to a body count that now stands at more than 57,570, up to 20,000 of them combatants.

A Palestinian girl looks up at military drones circling Rafah refugee camp. AFP

Iran nuclear origins

The evolution of this terrifyingly efficient killing machine will shape future wars. It can be traced back to Israel’s programme of assassinating Iranian nuclear scientists, which, before Israel’s air strikes last month, culminated in the killing of Prof Mohsen Fakhrizadeh in 2020.

After that remote attack, Israel knew it could successfully target someone in distant Iran, so why not on its own doorstep? The success of the facial recognition in Iran drove a new advance in warfare, spawning the Lavender system.

Iranian scientist Mohsen Fakhrizadeh was killed in an attack on his car in 2020. Wikimedia Commons

Currency exchange killing

Ramy’s family has run a currency exchange business for 50 years, with two shops, one each in Gaza city and Rafah. So it was a shock when, in early December 2023, their building in Gaza city took a direct hit from a drone-fired missile that fortunately failed to explode.

Confused, they had no idea why they were targeted because they believed the Israeli military would strike only if they had a specific reason.

Two weeks later, the branch reopened but within another two days it was struck again and this time the missile detonated, killing Ramy's brother Mohammed, two employees and several bystanders.

They suspended in-person services but then in April 2024 one of their data entry assistants was killed near his home by an Israeli bomb. It later transpired that the employee, who had no role in money transfers, had been affiliated to Hamas for some time.

After the two attacks on their business, Ramy was certain that Gospel and Lavender had identified and tracked the employee to the premises. “But the artificial intelligence didn't take into account the dozens of civilian casualties that would result from targeting him in a commercial location,” he said.

“My brother died in that strike, even though he had absolutely no connection to any faction,” he added. “He was martyred simply because he happened to be next to someone, who wasn’t some high-ranking [Hamas] figure – just a regular guy with a political affiliation.”

Instances such as this raise doubts over trusting AI’s judgment on who precisely is "the enemy”. Noah Sylvia, a research analyst for emerging military technology at the Royal United Services Institute, concurs.

The Israeli military insists that human analysts verify every target, yet he raises the serious issue that “we don't know whether or not the [AI] models are creating the targets themselves”.

Ms Amaral agrees. “There is no requirement for checking how the machine is making those decisions to select targets,” she said. “Because it seems there are many, many people who aren't involved in military operations that have been killed”.

As many as 37,000 Palestinians – complete with their photographs, videos, telephone and social media data – have reportedly had their details entered on to the Lavender system.

“The Israelis created as many targets as they could and put them in a bank that would have tens of thousands of targets, because they were always expecting the next war with Hamas,” said Mr Sylvia.

Damaged buildings and ruins in northern Gaza, as seen from the Israeli side of the border. Reuters

Fusion warfare

Among the new technology introduced to Gaza is an upgraded tank, the Merkava 5 Barak, which was fitted with AI, sensors, radar and small cameras before deployment.

Inside the Barak are touch screens to input information that allows soldiers to rapidly transfer data to the AI “target bank” that is fed to an operations room at a secret location. “These tanks are big-sensor platforms sucking in all the data,” said AI analyst Mr Daniels.

In addition, there are ever-present drones over Gaza, mostly the Heron and Hermes variants, using their surveillance equipment and cameras to track people, phone calls and potentially encrypted messages.

With Israeli satellite coverage and covert observation posts, this makes the 363 square kilometres of the Gaza Strip the most surveilled land in the world. It has also allowed the Israeli military to strike targets with astonishing speed.

Mission score

The Lavender suspicion score is important because it is understood that if the AI picks up a “high-value target”, the operators will be willing to accept significant collateral damage, that is, the deaths of non-combatants, to kill a senior commander.

“They're doing that sort of risk calculation,” said Mr Kovar. “Rightly or wrongly, they are dialling back on the required confidence interval for taking those shots and I think that's part of the reason we've seen a lot of collateral damage.”
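
Put crudely, the calculation Mr Kovar describes pairs a confidence floor with a collateral-damage cap that grows with the target’s seniority. In the sketch below, the figure of up to 20 permitted civilian deaths for low-ranking members comes from the +972 disclosures cited above; the senior-commander cap and every other number are placeholders:

    # Crude sketch of the reported risk calculation. The low-rank cap reflects
    # the +972 disclosures; the senior-commander cap is a placeholder.
    CIVILIAN_DEATH_CAP = {"low_rank": 20, "senior_commander": 100}  # assumption

    def strike_cleared(rank: str, expected_civilian_deaths: int,
                       confidence: float, confidence_floor: float) -> bool:
        """Confidence floor plus collateral cap, both varying with seniority."""
        return (confidence >= confidence_floor
                and expected_civilian_deaths <= CIVILIAN_DEATH_CAP[rank])

    # A high-value target with the floor 'dialled back' to 60 per cent:
    print(strike_cleared("senior_commander", 40, confidence=65.0,
                         confidence_floor=60.0))  # True under these assumptions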

Israeli sources have confirmed that while the target information is rapidly digested by AI, F-15s, F-16s or F-35s will be circling overhead along with armed drones.

The Lavender operator, with input from Shin Bet intelligence, will then make the final click to authorise the strike, sending a missile rapidly hurtling towards the target.

“I’ve heard that the human operators would spend about 20 seconds to confirm a target, just to double-check that they were male,” said Ms Amaral.

That 20-second decision is what the military calls a TCT (time-constrained target): once the AI picks up the target, a strike has to be made as soon as possible. While these strikes are conducted on known Hamas operatives, on whom a lot of intelligence has been collected by Lavender, it is unclear what civilian casualties Israel is prepared to accept to eliminate the person.
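
As an illustration of how tight that reported window is, a TCT decision amounts to a countdown: the human check must fit inside it or the opportunity lapses. A toy sketch, assuming only the 20-second figure reported above:

    # Toy countdown illustrating the reported 20-second TCT review window.
    import time

    TCT_WINDOW_SECONDS = 20.0    # the reported human-review window

    def still_within_window(flagged_at: float) -> bool:
        """True while the operator's confirmation can still beat the window."""
        return time.monotonic() - flagged_at < TCT_WINDOW_SECONDS

    flagged_at = time.monotonic()
    # ... the operator's quick confirmation would happen here ...
    if still_within_window(flagged_at):
        pass  # strike would proceed 'as soon as possible'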

AI errors

Israel's “tolerance for errors is immensely high”, said Mr Sylvia. He said the data input did not account for “biases” in the people who created the model, with an argument that “decades of dehumanisation of Palestinians” might have influenced them.

The error factor was echoed by an Israeli source involved in AI and intelligence-gathering whom The National interviewed. “This is war and people will always make mistakes under the stress of combat,” he said.

But suggesting that AI was taking out lots of innocent people was “fantastical” and unlikely, he argued. “Yes, Lavender is being used a lot but this has not created some dystopian future where machines are out of control,” the Israeli officer insisted.

Lavender tweets

Despite the AI programme’s secrecy, analysis by The National showed that, dating back to July last year, more than 50 strikes had been published on the Israeli military's X account that it claimed were “intelligence-based” and had used “additional intelligence”.

Many of the posts featured pictures or videos of strikes accompanied by the statement: “Following intelligence-based information indicating the presence of Hamas terrorists, the Israeli military conducted precise strikes on armed Hamas terrorists gathered at two different meeting points in southern Gaza.”

One video, from August 13 last year, shows two men carrying long-barrelled weapons, probably AK-47s, walking behind a donkey cart. Seconds later, a missile strikes them, leaving the animal apparently unharmed in what is understood to have been AI-driven targeting.

Many “elimination” posts on X also show videos or pictures of Hamas members in Israel during the attacks on October 7, 2023, which experts believe were also fed into the Lavender database.

In a strike on Ahmed Alsauarka, a squad commander in the Nukhba force who participated in the October 7 killings, Israeli targeting on June 20 last year is thought to have assessed his gait and facial features before sending in the bomb that Israel claimed did not harm any civilians.

Israeli tanks are deployed at a position along the border with the Gaza Strip. AFP

Israel's response

The Israeli military told The National that humans remained firmly in control and that Lavender did not dictate strikes. “The AI tools process information, there is no tool used by the military that creates a target, the human in the chain has to create the target [for the] Israeli military,” the army said.

“All target strikes are made under international law. We have never heard of Lavender putting forward targets that have not had human approval.”

It added that the AI was not “a generative machine that creates its own rules” but “a rules-based machine” and the sources that feed it information were always humans.

“Lavender takes a defined set of sources and there are people whose job is to make sure that the sources that are feeding Lavender are precise, accurate and have human control. It then creates a recommendation for who the intelligence officer should look into.”

Data feeds

But machine-generated killings at scale are a growing concern for those who have helped build these systems. The amount of intelligence generated by surveillance in the modern world, let alone warfare, is such that it is indigestible by humans. “It would take you days to go through just a single hour’s worth of footage,” said Mr Sylvia.
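
The arithmetic behind that point is stark. Assuming, purely for illustration, a few dozen simultaneous drone and camera feeds over the territory:

    # Back-of-envelope arithmetic behind Mr Sylvia's point. The feed count is
    # an assumption for illustration; the article gives no figure.
    feeds = 60                        # hypothetical simultaneous video feeds
    footage_hours = feeds * 1         # one hour of coverage across all feeds
    analyst_hours_per_day = 8
    days_to_review = footage_hours / analyst_hours_per_day
    print(f"{footage_hours} hours of footage is about {days_to_review:.1f} "
          f"working days for a single analyst")   # 60 hours -> 7.5 days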

While data is key to Lavender’s effectiveness, the machine can only be as good as the information it is given. It cannot be blamed if it is fed faulty data.

Questions remain over the “digital literacy” of senior commanders who do not fully understand the nuances or shortcomings of AI. Ultimately, the experts say, the AI models will reflect the people that are using them.

Mr Kovar argued that “theoretically” AI could allow a much higher degree of accuracy with more rigorous target profiling given the information known about individuals in Gaza.

But machine learning also creates some uncertainty, and possibly unchecked autonomy, as it is unknown whether Lavender has “self-created” people it believes are threats.

Machine legal?

That creates a concern over the legalities of using AI for military means, an entirely new area of warfare but one that will certainly take hold given its “success” in Gaza.

Matt Mahmoudi, an adviser to Amnesty International on the legal use of AI in war, says Lavender is “totally in violation of international human rights and humanitarian law” and is a system that “erodes the presumption of innocence”.

“Lavender is based on unlawfully obtained mass surveillance data,” he added. “AI systems that turn up tens of thousands of targets on the basis of arbitrary data, would make any scientist say it’s flawed and discriminatory.”

Robert Buckland, a barrister and former Conservative cabinet minister, also raised the issue that the system was “only as good as the data” input into it, with the danger of that data being “incomplete, historic or out of date”, which would then make it “rubbish”.

But that is countered by Ms Flasch’s argument of “military advantage” that would justify killing civilians if taking out a terrorist mastermind could conclude the war.

Machines supreme

Countries are developing technology quicker than laws can keep up, said Lord Carlile, a barrister and former British MP. “There is a degree of urgency about this,” he said, but it usually took “critical events to make decisions happen”.

That, he agreed, raised the Terminator scenario, into which plays the worrying prospect that AI is now known to hallucinate or lie.

“I don't think we're going to end up with Terminator, but my concern is that we're going to be in a more automated battlefield and get close to that Terminator scenario,” Mr Kovar said.

This feeds into Mr Daniels’ warning that “when AI fails, it fails horribly”. This could have catastrophic consequences, as machines "don't have a conscience”.

That means compassionless AI could “keep prosecuting a war to achieve the desired effect” whereas humans “at some point go ‘yeah, that's enough suffering’,” and end the conflict.

The lightning advances Israel has made in AI, during its war on Gaza and elsewhere, have raised the stakes for future wars in which humans might have little control.

Some names have been changed to protect witness identity
