Documentation


Download the solution and unpack it to "D:\LogDatabases". The solution includes cmd scripts, ScimoreDB binaries, plugins (import.dll and managed FeedReader.dll with source code), SQL scripts, and ASP.NET files.

1. Installing ScimoreDB cluster

  • Install ScimoreDB Manager. Launch ScimoreDB_x64.4.0.msi and, under features, choose only Manager (exclude the Server feature!).
  • Install the database nodes. Launch cmd.exe (with admin rights) and execute install.cmd (in D:\LogDatabases). The script creates 5 services. NOTE: to control memory usage, modify the "db/CachePages" parameter; each page uses 16 KB.
  • Create a cluster. Launch ScimoreDB Manager.exe, connect to the local database (Server: localhost, Port: 999), open a query window (ignore the errors popping up about the cluster not yet being created), and execute the SQL in "solution.zip\create_cluster.sql".
  • Change the service account. Change the account of the scimoredb-0/5 services to one with permissions to read the sources (log files, Windows event log, RSS/HTTP, SQL Server, ...), and start them.
  • Install the Microsoft Visual C++ 2010 SP1 Redistributable Package (download from Microsoft).

2. Install Metadata

To install the metadata (the "LogSearch" database), open a ScimoreDB query window and execute the content of "solution.zip\SQLScripts\LogSearch.sql". The script also creates a multidimensional index to index the facts (Windows event log, application NLog, ...). The database defines stored procedures to add sources and create dimensions.
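The procedure names below are assumptions for illustration only; the real names and parameters are defined in solution.zip\SQLScripts\LogSearch.sql. Registering a source and a dimension could then look like:

```sql
-- Hypothetical calls; check LogSearch.sql for the actual
-- procedure names and parameter lists.
exec logSearch.AddSource 'MyAppLog';        -- register a source database
exec logSearch.CreateDimension 'Severity';  -- add a dimension to slice facts by
```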

3. Adding sources

When adding sources, there are a few rules to follow:

  1. The source database must contain 6 tables named Log1 to Log6. As time progresses, facts are populated to the tables in rotation: Log1 -> Log2 -> ... -> Log6 -> Log1. Each Log table holds 30 days of data (configurable) before the rotation moves on to the next Log table; when the rotation moves past Log6, Log1 is truncated. You can change the 30-day value in the metadata table "ActiveBlockDetails", column "BlockActiveDays". Don't forget to set the PARTITION attribute on each Log table, in order to distribute the facts across the cluster (see RSSFeed.sql)!

  2. Implement the SynchronizeBlock stored procedure. The procedure is called by the metadata's Synchronize procedure, and it is meant to populate the facts. The parameter @blockId identifies the active Log[1-6] table, and @startLogId is the sequence number - the point from which the multidimensional index picks up new facts. For example, if the source SQL table contains an identity column, using it as the sequence number allows reading all existing rows the first time and then reading only new rows with the clause identField > @startLogId. For log-file sources, the sequence number is the size of the log file: when the log file grows, only the rows appended since the last recorded position are indexed.
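A minimal sketch of one of the rotation tables from rule 1, assuming a T-SQL-like dialect. The column set and the exact PARTITION syntax are assumptions; copy the real definition from RSSFeed.sql:

```sql
-- Sketch only: columns and PARTITION syntax are assumptions,
-- take the real definition from RSSFeed.sql.
CREATE TABLE Log1 (
  LogId   BIGINT,        -- sequence number (see SynchronizeBlock, rule 2)
  Created DATETIME,
  Message TEXT,
  PRIMARY KEY (LogId)
) PARTITION (LogId);     -- distributes the facts across cluster nodes
```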
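The incremental-load idea from rule 2 can be sketched as follows for a SQL-table source with an identity column. The procedure body, table, and column names are illustrative assumptions, not the solution's actual code:

```sql
-- Illustrative only: copies rows newer than the last indexed
-- sequence number into the active Log table.
CREATE PROCEDURE SynchronizeBlock (@blockId INT, @startLogId BIGINT)
AS
BEGIN
  -- @blockId selects the active table among Log1..Log6;
  -- shown here for Log1 only.
  INSERT INTO Log1 (LogId, Created, Message)
  SELECT identField, created, message
  FROM   SourceTable
  WHERE  identField > @startLogId;  -- only rows added since the last run
END
```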

RSS feed example

Adds an RSS feeds source. The example works out of the box and is the recommended one to start with.

Windows event log and other sources

An example of how to index the Windows event log. This example also works out of the box.


A few examples of how to index other sources (SQL Server, log text files, ...).

4. Synchronize

When the source(s) (e.g. RSS feeds) have been added, perform the initial load (first-time synchronization):

execute logSearch.Synchronize 

The Synchronize procedure will copy and index the changes. Note: it takes about 1 minute before a new fact appears in search (the multidimensional index).

To verify the solution works, execute:

exec logSearch.SearchEntities '+Created:2013*' 

Periodic synchronize & index compact

For periodic updates, you may create Windows scheduled tasks that execute the Synchronize procedure:

  • Synchronize task.

    • Trigger: every 15 minutes.
    • Action: Start a program.

    Command line:

    "d:\logdatabases\isql.exe". Arguments: --server=localhost --port=999 --database=LogSearch --file="D:\Logdatabases\SynchronizeActivityLog.sql"

  • Optimize indexes (it is important to compact the index once in a while!).

    • Trigger: daily at 23:00 (when load is lowest).
    • Action: Start a program.

    Command line:

    "d:\logdatabases\isql.exe". Arguments: --server=localhost --port=999 --database=LogSearch --file="D:\Logdatabases\OptimizeIndexes.sql"

5. Configure IIS

  • Copy the web folder from solution.zip\web to D:\Logdatabases\web.
  • Create Virtual Directory "Activity", mapping to D:\Logdatabases\web.
  • Choose Application Pool: "ASP.NET v4.0 (integrated)"
  • Access with the Chrome browser:
    • http://localhost/activity/EntitySearch/entitiesSearch.aspx
    • http://localhost/activity/EntitySearch/report.aspx