Data-driven deaths: How Israel's AI war machine pinpoints Palestinian victims


Ali lived in what had been a relatively untouched neighbourhood in eastern Gaza city until the night of October 12, 2024, when an Israeli bomb struck out of nowhere. Shaken but unhurt, the IT technician fled with his laptop to the Sheikh Radwan neighbourhood in the north of the city to live with his aunt.

Two weeks later, shortly before midnight, the Palestinian was working on his laptop on the rooftop, where he had gone in search of a stronger signal to upload files via a VPN, when he heard a drone circling overhead.

“It was closer than usual. Then seconds later I saw a red light coming down on to the rooftop right in front of me, no more than 20 metres away,” he says.

The blast threw him off his chair but he was largely unscathed. As he ran back down the stairs his aunt’s family shouted: “The strike was for you! What were you doing? Who were you communicating with? Who do you have connections with?” Ali’s uncle told him to pack his bags and leave.

He rushed to the house of a friend, an IT expert, who suggested Ali’s activities had been analysed by artificial intelligence and that he had been flagged as suspicious “for my ‘unusual behaviour’ of working with international companies, using encryption programmes and spending long hours online”.

Amid the devastation of Gaza, Ali and many others believe there is an unseen, pervasive AI presence that is watching, listening and waiting for those on its target list to show their faces.

Israeli special combat soldiers conduct a training exercise using virtual reality battlefield technology in 2017. Getty Images

To survive, Ali now accesses the internet under strict security measures and in very short bursts. “Their AI systems see me as a potential threat and a target,” he says, sharing the fears of many trapped Palestinians that a machine is now determining their fate.

In another such case, less than three minutes after two young men had entered the first floor of an apartment block, a bomb struck, killing not only the pair but also Mohsen Obeid’s mother, father and three sisters.

Mr Obeid, 34, was devastated and baffled. His family had no links to Hamas or any other faction. “We were innocent civilians,” he later told The National.

It was only after the attack in May last year that a consoling neighbour told him that he had seen the two young men, “presumably from the resistance”, entering the house.

The Obeid family were in their second-floor flat in Al Faluja, north of Gaza city, completely unaware that Israel's state-of-the-art AI system had almost certainly used its immense data-harvesting tools to give the men a high “suspicion score”.

In an investigation into these Gaza deaths, The National found:

  • Israel operates a 20-second decision review known as a TCT (time-constrained target) once a potential victim is picked up by the AI. These strikes are conducted on known Hamas operatives but also involve civilians
  • Israel’s state-of-the-art AI system uses data harvesting to give Gazans a high “suspicion score” that sets up a battlefield hit
  • Israel operates a series of AI systems that are routinely run with a confidence level as low as 80 per cent for confirming a legitimate target
  • Its known AI systems are Lavender, which raids data banks to generate potential confirmation of the target as a percentage; Gospel, which identifies static military targets such as tunnels; and Where’s Daddy?, which computes when a person is in a certain place
  • The target acquisition relies on facial recognition and other tools, including mapping a person’s gait and cross-checking identities
  • The target set includes as many as 37,000 Palestinians – complete with their photographs, videos, phone numbers and social media data – profiled in the systems.

Hoovering data

That suspicion score, as much as anything else, sealed their fate. The system code-named Lavender had mined data collected over many years, said Chris Daniels, an AI specialist at Flare Bright, a machine-learning company.

“Gaza is a hugely surveilled place and Israel has taken that imagery, for however many years recording it all and feeding it into the system. They are just hoovering up all that data, visual facial recognition, phones, messages, internet, social media and because you've got two million people that is a huge amount for a machine-learning programme.”

Tal Hagin, an open-source intelligence analyst and software specialist based in Israel, believes there is an inflection point coming, if not already reached, where AI will make most battlefield decisions.

An Israeli drone flies over Rafah in the southern Gaza Strip. AFP

“The question is, are we at a point already where AI is taking over command, making decisions on the battlefield or are we still in the era of AI simply being an assistant, which is a huge difference.”

Certainly, in the early stages of the current Gaza conflict, the Israeli military was “eliminating different targets at a very, very increased speed” that would have required machine-generated information.

David Kovar, a cyber-AI specialist who has contracts with the US Department of Defence, said Israel had developed an enormous amount of their targeting information with AI “but they really weren't putting humans in the loop to validate whether these targets were legitimate”.

Human input

The strikes on Ali and the Obeid family, two among the many thousands that have taken place in Gaza over the past 21 months, raise serious concerns over the machine-driven mission Israel is carrying out. Does AI select the right people? Do humans have enough input? Are there any controls?

The National’s investigation found not only questions over the accuracy of Lavender’s decision-making but also many civilians, like Mr Obeid’s family, who appear to have been killed on the system’s information, and evidence that a high collateral death toll is accepted when acting on it.

Israeli forces are using high-tech weapons in Gaza. Getty Images

The Israeli army, Mr Obeid said, “killed my family based on information generated by artificial intelligence” and without verifying if others were present in the building.

At the other end of the attack are the Israeli military's AI "target officers", who are apparently content to go with an 80 per cent probability to confirm a target for strike, despite the collateral consequences, said Mr Daniels, also a former British army officer with strong Israeli military and intelligence connections.

“When it's 83.5 per cent and the human in the loop goes, ‘yes’ either that’s a good enough number – or if there's a low-value target, the number might be 95 per cent – but for a high-value target it could be as low as 60 per cent.”
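What such a decision rule might look like in practice can be illustrated with a short, purely hypothetical sketch. The only figures below are the percentages quoted by Mr Daniels; the names, structure and logic are assumptions made for explanation and are not a reconstruction of any real targeting software.

```python
# Illustrative sketch only: a value-dependent confidence threshold of the kind
# described above. All names and logic are assumptions for explanation,
# not a reconstruction of any disclosed Israeli system.
from dataclasses import dataclass

# Hypothetical minimum confidence before a recommendation goes to a human
# operator, varying with how valuable the target is judged to be.
CONFIDENCE_THRESHOLDS = {
    "high_value": 0.60,  # "for a high-value target it could be as low as 60 per cent"
    "standard": 0.80,    # the 80 per cent figure cited by several sources
    "low_value": 0.95,   # "if there's a low-value target, the number might be 95 per cent"
}

@dataclass
class Recommendation:
    confidence: float    # e.g. 0.835 for the 83.5 per cent example above
    target_value: str    # "high_value", "standard" or "low_value"

def passes_threshold(rec: Recommendation) -> bool:
    """Return True if the machine's confidence clears the value-dependent bar."""
    return rec.confidence >= CONFIDENCE_THRESHOLDS[rec.target_value]

# The example from the quote: 83.5 per cent against an assumed 80 per cent bar.
print(passes_threshold(Recommendation(confidence=0.835, target_value="standard")))  # True
```

In this toy version the only human involvement is reviewing a recommendation the machine has already scored, which is precisely the concern the sources raise about how thin the "human in the loop" can become.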

The “tolerance for errors” within the Israeli command room, he has been told, was “immensely high”. “There's an element of dangerous errors pretty quickly if you remove the human,” he added.

The National spoke to Olivia Flasch, a lawyer who has advised the UN on the laws of armed conflict. She said: “It's prohibited to launch an attack that's expected to cause injury to civilians, that's excessive in relation to the concrete military advantage that is anticipated."

If a commander was 80 per cent sure that the target was the “mastermind of a terrorist organisation” and with him dead the war was likely to end, “that's a high military advantage”, she said. The assessment did not apply to rank-and-file fighters.

Sadly, for Mr Obeid and many others in Gaza, it appears the system named after the garden herb is now more redolent of death.

Killer systems

Lavender has been fed a vast mine of data, including images taken from covert surveillance, open-source intelligence and information from the justice system on Palestinians whom Israeli intelligence determined belonged to Hamas or other groups in Gaza.

The Lavender system does not necessarily generate targets but instead processes information that is generated and displayed for an intelligence officer. It is understood this then travels up a chain of command to a higher-ranking intelligence officer who will take into account civilian casualties when authorising a strike mission.

The National is also aware that there are other AI systems used by Israel whose codenames have not yet been disclosed – it is unclear what their capabilities are, such as in terms of precision targeting.

Insiders worry that Lavender has spawned a form of warfare where the human touch is largely absent at vital points.

An Israeli drone pilot beside a Hermes 900 unmanned aerial vehicle at Palmachim Airbase. Getty Images

Mr Kovar's information suggested that if a person was spotted above ground, moving between buildings, and AI had 80 per cent confidence this was a legitimate target, “they're going to take that shot”, he said, despite the risk of “collateral damage”.

The National has spoken to security sources and experts, and viewed open-source intelligence, to piece together how the system works, from acquiring a target to their “elimination”.

When a person with a high “suspicion score” has their face recognised and location identified by AI, machine-driven analysis goes to work. This will include studying the person’s gait, their location and, using Gospel, their expected destination, alongside a wealth of other data processed within seconds.

Mr Kovar said considerable effort had been "put into human facial and gait recognition, how people walk and move, and where they’re going”.
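How signals such as a face match, gait analysis, location patterns and communications data might be fused into a single “suspicion score” can only be sketched in outline, because the system's real inputs, weights and workings are secret. Everything in the example below, from the feature names to the weights, is an illustrative assumption rather than a description of Lavender.

```python
# Purely illustrative sketch of fusing several surveillance signals into one
# composite "suspicion score". The feature names and weights are invented for
# explanation; the real system's inputs and workings are not public.
def suspicion_score(signals: dict[str, float]) -> float:
    """Combine normalised signals (0.0 to 1.0) into a weighted composite score."""
    weights = {
        "facial_match": 0.4,      # face recognised against a stored profile
        "gait_match": 0.2,        # how the person walks and moves
        "location_pattern": 0.2,  # presence at flagged places or destinations
        "comms_metadata": 0.2,    # phone, messaging and social media links
    }
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

print(suspicion_score({"facial_match": 0.9, "gait_match": 0.8,
                       "location_pattern": 0.7, "comms_metadata": 0.6}))  # roughly 0.78
```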

The system also significantly speeds up the ability to make observe, orientate, decide and act (Ooda) loop decisions, allowing for a rapid military response.

“If AI can get you through that Ooda loop, from an image to identifying who the human is, then saying, ‘OK, we're going to take the shot faster than a human can do it’ particularly [if] a human has to go to check with higher-ups, then they're going to use the AI," he said.

The assembled Lavender information goes to an operator, giving the potential confirmation of the target as a percentage (for example, 83.5 per cent) alongside their suspicion score, suggesting how senior a figure might be.

The AI system also significantly increases the speed of response to hitting a target without having to wait for authorisation from senior officers.

Drone footage of Hamas leader Yahya Sinwar moments before he was killed. AFP

The Lavender system was first disclosed by the Israeli outlet +972 in April last year, with Israeli sources claiming that operators were permitted to kill 15 or 20 civilians to eliminate even low-ranking Hamas members.

The Gospel AI system first appeared on Israeli armed forces' websites in 2021, described as an algorithm-based tool that identifies static military targets, such as tunnels or fighters’ homes, and can assist a rapid response if a suspect enters them. It has since been paired with another system called Where’s Daddy?, which can compute when a person is in a certain place.

“Where’s Daddy? is used to track individuals that have been targeted by Lavender and it strikes individuals once they've entered their homes,” said AI specialist Nilza Amaral, of the Chatham House think tank. This might explain why most of Israel's attacks on AI-identified targets take place on buildings.

Key champion

Outlandish as that might seem, Brig Yossi Sariel, head of Israel’s Unit 8200, the specialist team that introduced Lavender, wrote a book called The Human-Machine Team: How to Create Synergy Between Human & Artificial Intelligence That Will Revolutionise Our World.

In it, he describes a “target machine” which processes people’s connections and movements via social media, mobile phone tracking and household addresses.

Brig Sariel is allegedly a key driver behind the use of AI and, while the system’s precise workings remain closely guarded, it is understood Lavender generates a numerical “suspicion score” that, if high enough, will mark a person as a target for elimination.

Generating that high suspicion score makes death in Gaza a near inevitability, whether the person is a member of Hamas or Palestinian Islamic Jihad, or neither.

The strikes on the fighters, particularly in the early months of the campaign, were incessant, contributing to the death toll that now stands at more than 57,570, of whom up to 20,000 were combatants.

A Palestinian girl looks up at military drones circling Rafah refugee camp. AFP

Iran nuclear origins

The evolution of this terrifyingly efficient killing machine will shape future wars. It can be traced back to Israel’s programme of assassinating Iranian nuclear scientists which, before Israel’s air strikes last month, culminated in the killing of Prof Mohsen Fakhrizadeh in 2020.

After that remote attack, Israel knew it could successfully target someone in distant Iran, so why not on its own doorstep? The success of facial recognition in Iran drove a new advance in warfare, spawning the Lavender system.

Iranian scientist Mohsen Fakhrizadeh was killed in an attack on his car in 2020. Wikimedia Commons

Currency exchange killing

Ramy’s family runs a currency exchange business, with one shop each in Gaza city and Rafah, that it has operated for 50 years. So it was a shock when, in early December 2023, their building in Gaza city took a direct hit from a drone-fired missile that fortunately failed to explode.

Confused, they had no idea why they had been targeted, believing the Israeli military would strike only if it had a specific reason.

Two weeks later, the branch reopened but within another two days it was struck again and this time the missile detonated, killing Ramy's brother Mohammed, two employees and several bystanders.

They suspended in-person services but then in April 2024 one of their data entry assistants was killed near his home by an Israeli bomb. It later transpired that the employee, who had no role in money transfers, had been affiliated to Hamas for some time.

With the two attacks on their business, Ramy was certain that Gospel and Lavender had identified and tracked the employee to their premises. “But the artificial intelligence didn't take into account the presence of dozens of civilian casualties that would result from targeting him in a commercial location,” he said.

“My brother died in that strike, even though he had absolutely no connection to any faction,” he added. “He was martyred simply because he happened to be next to someone, who wasn’t some high-ranking [Hamas] figure – just a regular guy with a political affiliation.”

Instances such as this raise doubts over trusting AI’s judgment on who precisely is "the enemy”. Noah Sylvia, a research analyst for emerging military technology at the Royal United Services Institute, concurs.

The Israeli military insists that human analysts verify every target, yet he raises the serious issue that “we don't know whether or not the [AI] models are creating the targets themselves”.

Ms Amaral agrees. “There is no requirement for checking how the machine is making those decisions to select targets,” she said. “Because it seems there are many, many people who aren't involved in military operations that have been killed”.

As many as 37,000 Palestinians – complete with their photographs, videos, telephone and social media data – have reportedly had their data entered on to the Lavender system.

“The Israelis created as many targets as they could and put them in a bank that would have tens of thousands of targets, because they were always expecting the next war with Hamas,” said Mr Sylvia.

Damaged buildings and ruins in northern Gaza, as seen from the Israeli side of the border. Reuters

Fusion warfare

Among the new technology introduced to Gaza is an upgraded tank, the Merkava 5 Barak, which was fitted with AI, sensors, radar and small cameras before deployment.

Inside the Barak are touch screens to input information that allows soldiers to rapidly transfer data to the AI “target bank” that is fed to an operations room at a secret location. “These tanks are big-sensor platforms sucking in all the data,” said AI analyst Mr Daniels.

In addition, there are ever-present drones over Gaza, mostly the Heron and Hermes variants, using their surveillance equipment and cameras to track people, phone calls and potentially encrypted messages.

With Israeli satellite coverage and covert observation posts, this makes the 363 square kilometres of the Gaza Strip the most surveilled land in the world. It has also allowed the Israeli military to strike targets with astonishing speed.

Mission score

The Lavender suspicion score is important because it is understood that if the AI picks up a “high-value target”, the operators will be willing to accept significant collateral damage, that is, the deaths of non-combatants, to kill a senior commander.

“They're doing that sort of risk calculation,” said Mr Kovar. “Rightly or wrongly, they are dialling back on the required confidence interval for taking those shots and I think that's part of the reason we've seen a lot of collateral damage.”
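A crude illustration of that kind of risk calculation is sketched below. The caps are assumptions made only for explanation, loosely echoing the civilian-casualty allowances reported by +972 and cited earlier in this article; they do not represent any confirmed Israeli rules of engagement.

```python
# Illustrative sketch only: a collateral-damage cap that scales with how senior
# the target is judged to be. All figures and names are assumptions, loosely
# based on reported allowances, not confirmed rules of engagement.
MAX_CIVILIAN_CASUALTIES = {
    "low_ranking": 15,   # lower end of the reported allowance for junior Hamas members
    "mid_ranking": 20,   # upper figure reported for such strikes
    "high_value": 100,   # assumption: senior commanders reportedly permit far higher tolls
}

def strike_permitted(target_value: str, estimated_civilians: int) -> bool:
    """Return True if the estimated civilian toll is within the assumed cap."""
    return estimated_civilians <= MAX_CIVILIAN_CASUALTIES[target_value]

print(strike_permitted("low_ranking", 12))  # True under these assumed caps
print(strike_permitted("low_ranking", 30))  # False
```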

Israeli sources have confirmed that while the target information is rapidly digested by AI, F-15s, F-16s or F-35s will be circling overhead along with armed drones.

The Lavender operator, with input from Shin Bet intelligence, will then make the final click to authorise the strike, sending a missile rapidly hurtling towards the target.

“I’ve heard that the human operators would spend about 20 seconds to confirm a target, just to double-check that they were male,” said Ms Amaral.

That 20-second decision applies to what the military calls a TCT (time-constrained target): once one is picked up by AI, a strike has to be made as soon as possible. While these strikes are conducted on known Hamas operatives, on whom a lot of intelligence has been collected by Lavender, it is unclear what civilian casualties Israel is prepared to accept to eliminate the person.

AI errors

Israel's “tolerance for errors is immensely high”, said Mr Sylvia. He said the data input did not account for “biases” in the people who created the model, with an argument that “decades of dehumanisation of Palestinians” might have influenced them.

The error factor was echoed by an Israeli source involved in AI and intelligence-gathering whom The National interviewed. “This is war and people will always make mistakes under the stress of combat,” he said.

But suggesting that AI was taking out lots of innocent people was “fantastical” and unlikely, he argued. “Yes, Lavender is being used a lot but this has not created some dystopian future where machines are out of control,” the Israeli officer insisted.

Lavender tweets

Despite the AI programme’s secrecy, analysis by The National showed that dating back to July last year, there had been more than 50 strikes published on the Israeli military's X account that it claimed were “intelligence based” and had used “additional intelligence”.

Many of the posts featured pictures or videos of strikes accompanied by the statement: “Following intelligence-based information indicating the presence of Hamas terrorists, the Israeli military conducted precise strikes on armed Hamas terrorists gathered at two different meeting points in southern Gaza.”

One video, from August 13 last year, shows two men carrying long-barrelled weapons, probably AK47s, walking behind a donkey cart. Seconds later, a missile strikes them, leaving the animal apparently unharmed in what is understood to have been AI-driven targeting.

Many “elimination” posts on X also show videos or pictures of Hamas members in Israel during the attacks on October 7, 2023, which experts believe were also fed into the Lavender database.

For a strike on June 20 last year on Ahmed Alsauarka, a squad commander in the Nukhba force who took part in the October 7 killings, Israeli targeting is thought to have assessed his gait and facial features before sending in the bomb, which Israel claimed did not harm any civilians.

Israeli tanks are deployed at a position along the border with the Gaza Strip. AFP

Israel's response

The Israeli military told The National that humans remained firmly in control and that Lavender did not dictate strikes. “The AI tools process information, there is no tool used by the military that creates a target, the human in the chain has to create the target [for the] Israeli military,” the army said.

“All target strikes are made under international law. We have never heard of Lavender putting forward targets that have not had human approval.”

It added that the AI was not “a generative machine that creates its own rules” but “a rules-based machine” and the sources that feed it information were always humans.

“Lavender takes a defined set of sources and there are people whose job is to make sure that the sources that are feeding Lavender are precise, accurate and have human control. It then creates a recommendation for who the intelligence officer should look into.”

Data feeds

But machine-generated killings at scale are a growing concern for those who have helped build these systems. The amount of intelligence generated by surveillance in the modern world, let alone warfare, is such that it is indigestible by humans. “It would take you days to go through just a single hour’s worth of footage,” said Mr Sylvia.

While data is key to Lavender’s effectiveness, the machine can only be as good as the information it is given. It cannot be blamed if it is fed faulty data.

Questions remain over the “digital literacy” of senior commanders who do not fully understand the nuances or shortcomings of AI. Ultimately, the experts say, the AI models will reflect the people that are using them.

Mr Kovar argued that “theoretically” AI could allow a much higher degree of accuracy with more rigorous target profiling given the information known about individuals in Gaza.

But machine learning also causes some uncertainty and possibly unchecked autonomy, as it is unknown if Lavender has “self-created” people who it believes are threats.

Machine legal?

That creates a concern over the legalities of using AI for military means, an entirely new area of warfare but one that will certainly take hold given its “success” in Gaza.

Matt Mahoudi, an adviser to Amnesty International on the legal use of AI in war, says Lavender is “totally in violation of international human rights and humanitarian law” and is a system that “erodes the presumption of innocence”.

“Lavender is based on unlawfully obtained mass surveillance data,” he added. “AI systems that turn up tens of thousands of targets on the basis of arbitrary data, would make any scientist say it’s flawed and discriminatory.”

Robert Buckland, a barrister and former Conservative cabinet minister, also raised the issue that the system was “only as good as the data” inputted and had the danger of being “incomplete, historic or out of date", which would then make it “rubbish”.

But that is countered by Ms Flasch’s argument of “military advantage” that would justify killing civilians if taking out a terrorist mastermind could conclude the war.

Machines supreme

Countries are developing technology quicker than laws can keep up with, said Lord Carlisle, a barrister and former British MP. “There is a degree of urgency about this,” he said, but it usually took “critical events to make decisions happen”.

That, he agreed, raised the Terminator scenario, into which plays the worrying fact that AI is now known to hallucinate or lie.

“I don't think we're going to end up with Terminator, but my concern is that we're going to be in a more automated battlefield and get close to that Terminator scenario,” Mr Kovar said.

This feeds into Mr Daniels’ warning that “when AI fails, it fails horribly”. This could have catastrophic consequences, as machines "don't have a conscience”.

That means compassionless AI could “keep prosecuting a war to achieve the desired effect” whereas humans “at some point go ‘yeah, that's enough suffering’,” and end the conflict.

The lightning advances Israel has made in AI during its war on Gaza and elsewhere have raised the stakes for future wars in which humans might have little control.

Some names have been changed to protect witness identity
