OOOGUI vs traditional CMSs

To achieve this flexibility for the developer, OOOGUI must perform many more tasks at runtime than traditional CMS software. Many times more. This could be a deal-breaker if your priority is raw performance.

Here is an outline comparing the tasks performed by OOOGUI and by a traditional CMS:


  • OOOGUI:

    • frontend tier:

      • 1 - for each requested webpage, fetch the page structure (which modules are involved, and with which layout structure)
      • 2 - for each module in the page structure, get one or more queryIds
      • 3 - for each queryId fetch the query definition
      • 4 - for each queryId fetch the structure of the data involved in the query
      • 5 - execute the query to find resulting record ids
      • 6 - for each record id returned by the query, fetch the data, according to the data structure fetched in step 4
      • 7 - assign data to a template for visualization
      • 8 - output the page

    • backend tier:

      • 1 - fetch the data model (all data objects it defines)
      • 2 - for each data object in the data model, fetch its data properties
      • 3 - for each data property of a data object, generate an appropriate interface with CRUD functionality
      • 4 - generate each data object's CRUD interface by combining all its data property CRUD interfaces
      • 5 - generate the data model CRUD interface by combining all its data object CRUD interfaces

  • Traditional CMS:

    • frontend tier:

      • 1 - for each requested webpage, execute the queries in it
      • 2 - assign all resulting data to a template for visualization
      • 3 - output the page

    • backend tier:

      • 1 - for each data object managed by the CMS there is a static CRUD interface, so nothing has to be generated at runtime. Furthermore, there is no data model CRUD interface, because the data model cannot be modified without (re)writing code.
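The OOOGUI frontend pipeline above (steps 1 to 8) can be sketched roughly as follows. All names and data structures here are hypothetical illustrations, not the actual OOOGUI API; in-memory dicts stand in for the CMS database:

```python
# in-memory stand-ins for the CMS database (hypothetical content)
PAGE_STRUCTURES = {
    "/news": {"layout": "two-columns", "modules": ["latest_news"]},
}
MODULE_QUERIES = {"latest_news": ["q1"]}          # module -> queryIds
QUERY_DEFS = {"q1": {"object": "article", "where": lambda r: r["published"]}}
DATA_STRUCTURES = {"article": ["id", "title", "published"]}
RECORDS = {
    "article": {
        1: {"id": 1, "title": "Hello", "published": True},
        2: {"id": 2, "title": "Draft", "published": False},
    },
}

def render_page(path):
    page = PAGE_STRUCTURES[path]                      # step 1: page structure
    fragments = []
    for module in page["modules"]:
        for query_id in MODULE_QUERIES[module]:       # step 2: queryIds
            qdef = QUERY_DEFS[query_id]               # step 3: query definition
            fields = DATA_STRUCTURES[qdef["object"]]  # step 4: data structure
            table = RECORDS[qdef["object"]]
            # step 5: execute the query to find matching record ids
            ids = [rid for rid, rec in table.items() if qdef["where"](rec)]
            # step 6: fetch each record's data according to the structure
            rows = [{f: table[rid][f] for f in fields} for rid in ids]
            # step 7: assign data to a (trivial) template
            fragments.append("".join(f"<li>{r['title']}</li>" for r in rows))
    return f"<ul>{''.join(fragments)}</ul>"           # step 8: output the page

print(render_page("/news"))  # → <ul><li>Hello</li></ul>
```

Every lookup above is a runtime database round-trip in the real system, which is exactly the overhead a traditional CMS avoids by hard-coding queries and templates per page.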

So the price of flexibility is complexity and computation time. Is this a problem? It's a matter of numbers: we have successfully tested OOOGUI on websites with no more than a few thousand visits per day, and we have no statistics for really heavy-traffic scenarios. The backend complexity, though, is a non-issue, because the backend will never generate traffic comparable to the frontend tier. As for the frontend, mainly the query engine, there were two possible approaches:

  • the first was to dynamically generate a single big SQL query to fetch the whole data for an object
  • the second was to dynamically generate many micro SQL queries, each fetching a single piece of an object's data, and finally gather the pieces together to serve the whole result.
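The difference between the two approaches can be sketched with a hypothetical schema (an `article` object with a one-to-many `tags` property); the table and column names are invented for illustration:

```python
import sqlite3

# Hypothetical schema: an 'article' object with a one-to-many 'tag' property.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE article (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE tag (article_id INTEGER, name TEXT);
    INSERT INTO article VALUES (1, 'Hello');
    INSERT INTO tag VALUES (1, 'news'), (1, 'misc');
""")

# Approach 1: one big query. Every additional one-to-many property adds
# another join, and the intermediate result multiplies in size.
big = db.execute("""
    SELECT article.id, article.title, tag.name
    FROM article LEFT JOIN tag ON tag.article_id = article.id
    WHERE article.id = ?
""", (1,)).fetchall()

# Approach 2: many micro queries, one per property, gathered afterwards.
title = db.execute("SELECT title FROM article WHERE id = ?", (1,)).fetchone()[0]
tags = [r[0] for r in db.execute(
    "SELECT name FROM tag WHERE article_id = ?", (1,))]
record = {"id": 1, "title": title, "tags": tags}
```

With one property the two approaches are equivalent, but each extra one-to-many property multiplies the joined result set while only adding a constant number of micro queries.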

We implemented both approaches, but it soon became evident that when the data model is even slightly more than trivial, the first approach produces huge queries with too many SQL joins, and computation time explodes exponentially. So we kept the second approach: even without real stress-test statistics, computation time seems to grow linearly with data complexity rather than exponentially, which is an acceptable compromise.
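The two growth shapes can be illustrated with a back-of-the-envelope count, assuming an object with `n` one-to-many properties of `k` values each (the formulas are hypothetical simplifications, just to contrast linear and exponential growth):

```python
def micro_query_count(n, k):
    # second approach: one query for the record, one per property,
    # and one per fetched value -- grows linearly in n
    return 1 + n + n * k

def joined_row_count(n, k):
    # first approach: a single query joining all n one-to-many tables;
    # the cartesian product yields k ** n rows, exponential in n
    return k ** n

for n in (1, 2, 4, 8):
    print(n, micro_query_count(n, 3), joined_row_count(n, 3))
```

At `n = 8` properties with 3 values each, the micro-query approach issues a few dozen cheap queries, while the single joined query has to materialize thousands of rows for one record.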