Out of memory during build of large Computed Table

Comments

  • John Allen (Edited)

    Some further details...


    The largest of these tables currently contains about 120 GB of data, split across roughly 60 million rows at ~2 KB per row. Our single Sisense server has 28 GB of memory (edited; mis-quoted as 14 GB before).

    This table is expected to grow much larger over time.

  • Malinda Jepsen

    John,

    Is there a way that you can create the custom table through a view or some other mechanism on the source? For example, in SQL Server, I do all of my work in views and then use them as the source in the ElastiCube. I'm not sure if this is possible in your scenario.
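
    For illustration (table, view, and column names here are hypothetical), the pattern on SQL Server looks something like:

        CREATE VIEW dbo.vw_SalesForCube AS
        -- Pre-aggregate on the source so the ElastiCube imports a smaller result set
        SELECT
            CustomerID,
            CAST(OrderDate AS date) AS OrderDay,
            SUM(Amount)             AS TotalAmount
        FROM dbo.Sales
        GROUP BY CustomerID, CAST(OrderDate AS date);

    The ElastiCube then imports from dbo.vw_SalesForCube as if it were an ordinary table.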

    Malinda

  • Andrew Block

    John,


    Just to piggyback on what Malinda said: we do the same thing, and our back end is Oracle. We create materialized views in Oracle that do the aggregation, and Sisense then reads from those views.
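
    A minimal sketch of that pattern (object names hypothetical):

        CREATE MATERIALIZED VIEW sales_mv
        BUILD IMMEDIATE
        REFRESH COMPLETE ON DEMAND
        AS
        -- Aggregate once in Oracle so Sisense reads a much smaller result set
        SELECT region,
               TRUNC(order_date) AS order_day,
               SUM(amount)       AS total_amount
        FROM   sales
        GROUP  BY region, TRUNC(order_date);

    We refresh the materialized views on a schedule ahead of the Sisense build, so the cube only ever reads pre-aggregated rows.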

  • John Allen (Edited)

    Hi Malinda,

    Unfortunately, I am unable to do this. The custom table in Sisense is already a workaround to allow an accumulative build of the non-static source data.

    Moving it back to SQL Server defeats the purpose of the custom table, which is to compute the latest version of the accumulated source data, using rowversion and the primary key on the source table to filter out stale rows.
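
    Roughly, the custom table has this shape (names hypothetical, written as plain T-SQL; the actual statement is in the Sisense Custom SQL dialect):

        -- Keep only the newest version of each accumulated row,
        -- identified by the source table's primary key
        SELECT s.*
        FROM   SourceAccumulated s
        JOIN (
            SELECT Id, MAX(CAST(RowVersion AS bigint)) AS MaxVersion
            FROM   SourceAccumulated
            GROUP  BY Id
        ) latest
          ON  latest.Id = s.Id
          AND latest.MaxVersion = CAST(s.RowVersion AS bigint);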

     

    Oddly enough, the source data is already a view...

     

    Thanks,

    John

  • John Allen

    Is it possible to allow the build process to use virtual memory to build custom tables? This would give us a lot more room for building this large table. This server has ~50 GB of virtual memory available on SSD storage.

  • Andrew Block

    John,


    While I doubt Sisense would recommend it, have you tried playing around with the virtual memory settings?

  • Malinda Jepsen

    John,

    If you are going to use this as your ElastiCube server, you'll probably want to increase the amount of memory anyway for queries. According to the minimum requirements (https://documentation.sisense.com/minimum-requirements-and-supported-platforms/), depending on the number of users you have, you will want 32-64 GB of memory. For our high-availability deployment, we have 2 cube servers, each with 486 GB of memory. We also have a build server (it actually resides on the SQL Server machine that hosts our warehouse). Our largest cube has been as large as 450 GB, and we have around 100 concurrent users every day with more than 4,000 users overall.

    Malinda

  • John Allen

    I have allocated all the virtual memory I can (~50 GB). The ElastiCube build process doesn't appear to use virtual memory.

  • Michael Becker

    Hi John,

    I'd suggest splitting that Custom SQL into several steps if possible, especially if you have nested selects in that statement. Each step then has a much smaller memory footprint during the build.
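
    For example (hypothetical names), instead of one statement with the nested select inline, materialize the inner query as its own custom table and have the final table read from it:

        -- Step 1: custom table "LatestVersion", built first
        SELECT Id, MAX(RowVersion) AS MaxVersion
        FROM   SourceAccumulated
        GROUP  BY Id;

        -- Step 2: final custom table, built after step 1
        SELECT s.*
        FROM   SourceAccumulated s
        JOIN   LatestVersion l
          ON   l.Id = s.Id
          AND  l.MaxVersion = s.RowVersion;

    That way the build only has to hold one step's intermediate result at a time, rather than the whole nested query.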

