The other night I was sitting in a cafe on University Avenue in Palo Alto, sipping a hot tea. I overheard an animated discussion a couple of tables down that started to get more heated. Two entrepreneurs disagreed on whether they should use data striping or 15K RPM disks for the new social network service they were launching.

I became more and more irritated by the discussion, because these were obviously not tech guys, yet they were debating technical details that required knowledge they didn’t have. Indeed, they were comparing apples with oranges and backing up their arguments with quotes from famous entrepreneurs and venture capitalists from the Valley. They had no facts.

After a while, it became clear that they were planning to use a cloud service, which made their argument moot. It also became clear that their prototype had a performance problem, not a reliability problem. The right move was really to hire a systems person who could instrument their code and do some serious performance analysis. A developer with good R skills was what they needed, not hearsay from famous people.
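Instrumentation of this kind does not have to be elaborate. As a minimal sketch (in Python here for illustration; the same summary statistics are one-liners in R), two hypothetical helpers, `timed` and `summarize`, collect per-call latencies and report the percentiles that distinguish an average-throughput problem from a tail-latency one:

```python
import statistics
import time

def timed(fn, *args, samples=1000):
    """Instrument a function call: collect per-call latencies in seconds."""
    latencies = []
    for _ in range(samples):
        start = time.perf_counter()
        fn(*args)
        latencies.append(time.perf_counter() - start)
    return latencies

def summarize(latencies):
    """Report the numbers a performance analyst actually looks at:
    not just the mean, but the tail of the distribution."""
    xs = sorted(latencies)
    return {
        "mean": statistics.mean(xs),
        "p50": xs[len(xs) // 2],
        "p99": xs[int(len(xs) * 0.99)],
        "max": xs[-1],
    }

# Measure a toy workload instead of guessing about hardware.
stats = summarize(timed(lambda: sum(range(10_000))))
print(stats)
```

A facts-first argument starts with a table like this one, varied across candidate configurations, rather than with quotes from famous people.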

When it comes to data access systems, this lesson shouldn’t be lost: there are many opportunities to shoot yourself in the foot and kill performance. The key is getting the right personnel in place before the foot-shooting even begins. When you architect a data access system, use the services of a seasoned performance analyst with a good R toolbox. It is guaranteed to save you many headaches down the road and to help you avoid the executives-without-facts scenario I witnessed.

The entrepreneurs’ diatribe reminded me of ancient Greece, where there were two competing theories of human sight. One theory, the emission theory (Euclid, Ptolemy), claimed that vision worked by a little flame exiting the eye, traveling on rays that scanned the objects in the visual field and returned to the eye to report what they had detected. In the intromission theory (Aristotle), a viewed object replicates itself, and the replica travels along a ray into the viewer’s eye, where it is seen.

For a millennium, a debate raged over these two theories, based purely on theoretical considerations and heuristics. In his Book of Optics, written around 1015, Ibn Al-Haytham (or Alhazen) introduced the modern concept of the scientific research method based on experimentation and controlled testing that we still use today: a hypothesis is formulated, an experiment is conducted varying the parameters, the results of the experiment are discussed, and conclusions are drawn. Because of this, Ibn Al-Haytham is often referred to as the first scientist.

Using the scientific method, Ibn Al-Haytham developed the first plausible theory of vision. Among other contributions, he also explained the camera obscura and catoptrics. He strongly influenced later scientists such as Averroes, Leonardo da Vinci, Galileo Galilei, Christiaan Huygens, René Descartes, and Johannes Kepler.

In 2016, one year past the millennial anniversary of the scientific method, we should follow Al-Haytham’s example in the data access realm. Use the scientific method, and a flock of talented engineers and analysts, to come up with a data access strategy based on past performance data and systems analysis rather than coffee shop banter.