Strus Lua Bindings
0.14.0
Patrick P. Frey
Mozilla Public License v. 2.0 (MPLv2)
Lua interface for strus, a set of libraries and programs to build a text search engine

strus_Context

Object holding the global context of the strus information retrieval engine

strus_StorageClient

Object representing a client connection to a storage

strus_StorageTransaction

Object representing a transaction of the storage

strus_VectorStorageSearcher

Object used to search for similar vectors in the collection

strus_VectorStorageClient

Object representing a client connection to a vector storage

strus_VectorStorageTransaction

Object representing a vector storage transaction

strus_DocumentAnalyzer

Analyzer object representing a program for segmenting, tokenizing and normalizing a document into atomic parts that can be inserted into a storage and retrieved from there.

strus_QueryAnalyzer

Analyzer object representing a set of functions for transforming a field, the smallest unit in any query language, into a set of terms that can be used to build a query.

strus_QueryEval

Query evaluation program object representing an information retrieval scheme for documents in a storage.

strus_Query

Query program object representing a retrieval method for documents in a storage.

.new

Constructor

Parameter

config
(optional) context configuration. If not defined, a context for local mode with its own module loader is created
{rpc="localhost:7181"}
{trace="log=dump;file=stdout"}
{threads=12}
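
A usage sketch of the constructor. The global name strus_Context follows the class name above; how the binding module is actually loaded depends on your installation, so treat it as an assumption:

```lua
-- a context for local mode with its own module loader
local ctx = strus_Context.new()

-- a context connected to an RPC server (address is an example)
local rpcCtx = strus_Context.new({rpc="localhost:7181"})
```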

:loadModule

Load a module

Parameter

name
name of the module to load
"analyzer_pattern"
"storage_vector"

Remarks

Only implemented in local mode with own module loader (see constructors)

Examples

loadModule("storage_vector")

:addModulePath

Add one or more paths in which to search for loadable modules

Parameter

paths
a string or a list of module search paths
{"/home/bob/modules", "/home/anne/modules"}
"/home/bob/modules"

Remarks

Only implemented in local mode with own module loader (see constructors)

Examples

addModulePath("/home/bob/modules")

:addResourcePath

Add a path from which to load analyzer resource files

Parameter

paths
a string or a list of resource search paths
{"/home/bob/resources", "/home/anne/resources"}
"/home/bob/resources"

Remarks

Only implemented in local mode with own module loader (see constructors)

:createStorageClient

Create a storage client instance

Parameter

config
(optional) configuration (string or structure with named elements) of the storage client, or undefined if the default remote storage of the RPC server is chosen

Examples

createStorageClient()
createStorageClient("path=/srv/searchengine/storage; metadata=doclen UINT32, date UINT32, docweight FLOAT")
createStorageClient({path="/srv/searchengine/storage", metadata="doclen UINT32, date UINT32, docweight FLOAT"})
createStorageClient({ path="/srv/searchengine/storage" metadata="doclen UINT32, date UINT32, docweight FLOAT" max_open_files=256 write_buffer_size="4K" block_size="4M" cache="1G" })

:createVectorStorageClient

Create a vector storage client instance

Parameter

config
(optional) configuration (string or structure with named elements) of the vector storage client, or undefined if the default remote vector storage of the RPC server is chosen

Examples

createVectorStorageClient()
createVectorStorageClient("path=/srv/searchengine/vecstorage")
createVectorStorageClient({path="/srv/searchengine/vecstorage"})

:createStorage

Create a new storage (physically) described by config

Parameter

config
storage configuration (string or structure with named elements)

Remarks

Fails if the storage already exists

Examples

createStorage("path=/srv/searchengine/storage; metadata=doclen UINT32, date UINT32, docweight FLOAT; acl=yes")
createStorage({path="/srv/searchengine/storage", metadata="doclen UINT32, date UINT32, docweight FLOAT", acl=true})

:createVectorStorage

Create a new vector storage (physically) described by config

Parameter

config
vector storage configuration (string or structure with named elements)

Remarks

Fails if the storage already exists

Examples

createVectorStorage("path=/srv/searchengine/vecstorage")
createVectorStorage({path="/srv/searchengine/vecstorage"})

:destroyStorage

Delete the storage (physically) described by config

Parameter

config
storage configuration (string or structure with named elements)

Remarks

Use this function with care

Notes

Works also on vector storages

Examples

destroyStorage("path=/srv/searchengine/storage")
destroyStorage({path="/srv/searchengine/storage"})

:detectDocumentClass

Detect the type of document from its content

Parameter

content
the document content to classify

Examples

detectDocumentClass("<?xml version='1.0' encoding='UTF-8'?><doc>...</doc>")

:createDocumentAnalyzer

Create a document analyzer instance

Parameter

doctype
structure describing the segmenter to use (either document class description structure or segmenter name)
{mimetype="application/xml", encoding="UTF-8", scheme="customer"}
{mimetype="application/json", encoding="UTF-8"}
{segmenter="textwolf"}
"application/json"
"json"
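
A sketch of creating an analyzer, assuming ctx is an existing strus_Context instance:

```lua
-- select the segmenter explicitly by name
local analyzer = ctx:createDocumentAnalyzer({segmenter="textwolf"})

-- or let the segmenter be derived from a document class
local jsonAnalyzer = ctx:createDocumentAnalyzer("application/json")
```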

:createQueryAnalyzer

Create a query analyzer instance

Parameter

no parameters defined

:createQueryEval

Create a query evaluation instance

Parameter

no parameters defined

:unpackStatisticBlob

Unpack a statistics blob retrieved from a storage

Parameter

blob
blob with statistics to decode
procname
(optional) name of statistics processor to use for decoding the message (use default processor, if not defined)
"default"
""

:close

Force cleanup to circumvent object pooling mechanisms in an interpreter context

Parameter

no parameters defined

:debug_serialize

Debug method that returns the serialization of the arguments as string

Parameter

arg
structure to serialize as string for visualization (debugging)

Notes

This function is used for verifying that the deserialization of binding language data structures works as expected
The only way to construct a storage client instance is to call Context::createStorageClient(const std::string&)

:nofDocumentsInserted

Get the number of documents inserted into this storage

Parameter

no parameters defined

:documentFrequency

Get the number of inserted documents in which a specific feature occurs

Parameter

type
the term type of the feature queried
term
the term value of the feature queried

:documentNumber

Get the internal document number from the document identifier

Parameter

docid
document identifier

:documentForwardIndexTerms

Get an iterator on the tuples (value,pos) of the forward index of a given type for a document

Parameter

docno
internal local document number
termtype
term type string
pos
(optional) ordinal start position in forward index (where to start iterating)

:documentSearchIndexTerms

Get an iterator on the tuples (value,tf,firstpos) of the search index of a given type for a document

Parameter

docno
internal local document number
termtype
term type string

:postings

Get an iterator on the set of postings inserted

Parameter

expression
query term expression
restriction
(optional) meta data restrictions
start_docno
(optional) starting document number
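
A sketch, assuming storage is a strus_StorageClient instance and that a single term expression is written as a {type, value} pair; the iteration idiom shown is an assumption to check against your strus version:

```lua
-- iterate over the postings of the term with type 'word' and value "hello"
for posting in storage:postings({"word", "hello"}) do
    -- each posting addresses a matching document and its positions
end
```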

:select

Get an iterator on records of selected elements for matching documents starting from a specified document number

Parameter

what
list of items to select: names of document attributes or meta data or "position" for matching positions or "docno" for the document number
expression
query term expression
restriction
(optional) meta data restrictions
start_docno
(optional) starting document number
accesslist
(optional) list of access restrictions (one of them must match)
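
A sketch, assuming storage is a strus_StorageClient instance; the term expression convention and the iteration idiom are assumptions:

```lua
-- select document number and title attribute of all documents
-- matching the term with type 'word' and value "hello"
for row in storage:select({"docno", "title"}, {"word", "hello"}) do
    -- row carries the selected items of one matching document
end
```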

:termTypes

Get an iterator on the term types inserted

Parameter

no parameters defined

:docids

Get an iterator on the document identifiers inserted

Parameter

no parameters defined

:docid

Get the document identifier associated with a local document number

Parameter

docno
local document number queried

:usernames

Get an iterator on the user names (roles) used in document access restrictions

Parameter

no parameters defined

:attributeNames

Get the list of inserted document attribute names

Parameter

no parameters defined

:metadataNames

Get the list of inserted document metadata names

Parameter

no parameters defined

:getAllStatistics

Get an iterator on message blobs that encode all statistics of the storage (e.g. feature occurrences and number of documents inserted)

Parameter

sign
(optional) true = registration, false = deregistration; if false, the sign of all statistics is inverted

Notes

The blobs can be decoded with Context::unpackStatisticBlob

:getChangeStatistics

Get an iterator on message blobs that encode changes in the statistics of the storage (e.g. feature occurrences and number of documents inserted)

Parameter

no parameters defined

Notes

The blobs can be decoded with Context::unpackStatisticBlob

:createTransaction

Create a transaction

Parameter

no parameters defined

:config

Get the configuration of this storage

Parameter

no parameters defined

:configstring

Get the configuration of this storage as string

Parameter

no parameters defined

:close

Close the storage client connection

Parameter

no parameters defined

Notes

The only way to construct a storage transaction instance is to call StorageClient::createTransaction()

:insertDocument

Prepare the insertion of a document into the storage

Parameter

docid
the identifier of the document to insert
doc
the structure of the document to insert (analyzer::Document)

Notes

The document is physically inserted with the call of 'commit()'
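
How insertDocument fits into a complete insert workflow, as a sketch; storage, analyzer and content are assumed to be an existing storage client, document analyzer and document content string:

```lua
local doc = analyzer:analyzeSingle(content)      -- build the analyzed document structure
local transaction = storage:createTransaction()
transaction:insertDocument("doc_1", doc)         -- prepared only at this point
transaction:commit()                             -- physically inserted here
```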

:deleteDocument

Prepare the deletion of a document from the storage

Parameter

docid
the identifier of the document to delete

Notes

The document is physically deleted with the call of 'commit()'

:deleteUserAccessRights

Prepare the deletion of all document access rights of a user

Parameter

username
the name of the user whose access rights (in the local collection) are to be deleted

Notes

The user access rights are changed accordingly with the next implicit or explicit call of 'flush'

:commit

Commit all insert, delete, and user access right change statements of this transaction.

Parameter

no parameters defined

:rollback

Rollback all insert, delete, and user access right change statements of this transaction.

Parameter

no parameters defined

:findSimilar

Find the vectors most similar to a given vector

Parameter

vec
vector to search for (list of doubles)
maxNofResults
maximum number of results to return
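
A sketch, assuming searcher is a strus_VectorStorageSearcher instance; the vector dimension must match the one configured for the storage (a 3-dimensional vector is used here only for brevity):

```lua
-- get the 10 vectors most similar to the example vector
local results = searcher:findSimilar({0.1, 0.2, 0.7}, 10)
```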

:findSimilarFromSelection

Find the vectors most similar to a given vector within a selection of features addressed by index

Parameter

featidxlist
list of candidate indices (list of integers)
vec
vector to search for (list of doubles)
maxNofResults
maximum number of results to return

:close

Controlled close to free resources (forces freeing of resources in an interpreter context with a garbage collector)

Parameter

no parameters defined

Notes

The only way to construct a vector storage client instance is to call Context::createVectorStorageClient(const std::string&)

:createSearcher

Create a searcher object for scanning the vectors for similarity

Parameter

range_from
start of the feature range for the searcher (allows splitting the search across multiple searcher instances)
range_to
end of the feature range for the searcher (allows splitting the search across multiple searcher instances)

:createTransaction

Create a vector storage transaction instance

Parameter

no parameters defined

:conceptClassNames

Get the list of concept class names defined

Parameter

no parameters defined

:conceptFeatures

Get the list of indices of features represented by a learnt concept feature

Parameter

conceptClass
name identifying a class of concepts learnt
conceptid
index of the learnt concept (indices start from 1)

:nofConcepts

Get the number of concept features learnt for a class

Parameter

conceptClass
name identifying a class of concepts learnt.

:featureConcepts

Get the set of learnt concepts of a class for a defined feature

Parameter

conceptClass
name identifying a class of concepts learnt
index
index of vector in the order of insertion starting from 0

:featureVector

Get the vector assigned to a feature addressed by index

Parameter

index
index of the feature (starting from 0)

:featureName

Get the name of a feature by its index starting from 0

Parameter

index
index of the feature (starting from 0)

:featureIndex

Get the index starting from 0 of a feature by its name

Parameter

name
name of the feature

:nofFeatures

Get the number of feature vectors defined

Parameter

no parameters defined

:config

Get the configuration of this vector storage

Parameter

no parameters defined

:configstring

Get the configuration of this vector storage as string

Parameter

no parameters defined

:close

Controlled close to free resources (forces freeing of resources in an interpreter context with a garbage collector)

Parameter

no parameters defined

:addFeature

Add a named feature to the vector storage

Parameter

name
unique name of the feature added
vec
vector assigned to the feature

:defineFeatureConceptRelation

Assign a concept (index) to a feature referenced by index

Parameter

conceptClass
name of the relation
featidx
index of the feature
conidx
index of the concept

:commit

Commit the transaction

Parameter

no parameters defined

:rollback

Rollback the transaction

Parameter

no parameters defined

:close

Controlled close to free resources (forces freeing of resources in an interpreter context with a garbage collector)

Parameter

no parameters defined

Remarks

The only way to construct a document analyzer instance is to call createDocumentAnalyzer of the Context interface

:addSearchIndexFeature

Define how a feature to insert into the inverted index (search index) is selected, tokenized and normalized

Parameter

type
type of the features produced (your choice)
"word"
selectexpr
expression selecting the elements to fetch for producing this feature
"/doc/text//()"
"/doc/user@id"
"/doc/text[@lang='en']//()"
tokenizer
tokenizer function description to use for this feature
"split"
{"regex", "[0-9]+"}
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}
options
(optional) a list of option strings, each one of {"content" => feature has its own position, "unique" => feature gets a position, but sequences of "unique" features without "content" features in between are mapped to one position, "pred" => the position is bound to the preceding feature, "succ" => the position is bound to the succeeding feature}
"content"
"unique"
"succ"
"pred"

Examples

addSearchIndexFeature("word", "/doc/elem", "word", {"lc", {"stem", "en"}})

:addForwardIndexFeature

Define how a feature to insert into the forward index (for summarization) is selected, tokenized and normalized

Parameter

type
type of the features produced
"word"
selectexpr
expression selecting the elements to fetch for producing this feature
"/doc/text//()"
"/doc/user@id"
"/doc/text[@lang='en']//()"
tokenizer
tokenizer function description to use for this feature
"split"
{"regex", "[0-9]+"}
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}
options
(optional) a list of options, each one of {"content" => feature has its own position, "unique" => feature gets a position, but sequences of "unique" features without "content" features in between are mapped to one position, "pred" => the position is bound to the preceding feature, "succ" => the position is bound to the succeeding feature}
"content"
"unique"
"succ"
"pred"

:addPatternLexem

Declare an element to be used as a lexem by post-processing pattern matching, but not put into the result of document analysis

Parameter

type
term type name of the lexem to be fed to the pattern matching
"word"
selectexpr
an expression that describes which elements are taken from a document for this feature (tag selection in abbreviated XPath syntax)
"/doc/text//()"
"/doc/user@id"
"/doc/text[@lang='en']//()"
tokenizer
tokenizer (ownership passed to this) to use for this feature
"split"
{"regex", "[0-9]+"}
normalizers
list of normalizers (element ownership passed to this) to use for this feature
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}

:defineMetaData

Define how a feature to insert as meta data is selected, tokenized and normalized

Parameter

fieldname
name of the addressed meta data field.
"date"
selectexpr
expression selecting the elements to fetch for producing this feature
"/doc/text//()"
"/doc/user@id"
"/doc/text[@lang='en']//()"
tokenizer
tokenizer function description to use for this feature
"split"
{"regex", "[0-9]+"}
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}
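
A sketch, assuming analyzer is a strus_DocumentAnalyzer instance; the tokenizer name "content" is an assumption, and the date2int normalizer arguments follow the example values listed above:

```lua
-- fill the meta data field 'date' from the /doc/date element,
-- normalized to an integer by the 'date2int' normalizer
analyzer:defineMetaData("date", "/doc/date()", "content", {"date2int", "d", "%Y-%m-%d"})
```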

:defineAggregatedMetaData

Declare some aggregated value of the document to be put into the meta data table used for restrictions, weighting and summarization.

Parameter

fieldname
name of the addressed meta data field.
"doclen"
function
defining how and from what the value is aggregated
{"count", "word"}

:defineAttribute

Define how a feature to insert as a document attribute (for summarization) is selected, tokenized and normalized

Parameter

attribname
name of the addressed attribute.
"docid", "title"
selectexpr
expression selecting the elements to fetch for producing this feature
"/doc/text//()"
"/doc/user@id"
"/doc/text[@lang='en']//()"
tokenizer
tokenizer function description to use for this feature
"split"
{"regex", "[0-9]+"}
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}

:addSearchIndexFeatureFromPatternMatch

Define a result of pattern matching as feature to insert into the search index, normalized

Parameter

type
type name of the feature to produce.
"concept"
patternTypeName
name of the pattern to select
"word"
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}
options
(optional) a list of option strings, each one of {"content" => feature has its own position, "unique" => feature gets a position, but sequences of "unique" features without "content" features in between are mapped to one position, "pred" => the position is bound to the preceding feature, "succ" => the position is bound to the succeeding feature}
"content"
"unique"
"succ"
"pred"

:addForwardIndexFeatureFromPatternMatch

Define a result of pattern matching as feature to insert into the forward index, normalized

Parameter

type
type name of the feature to produce.
"concept"
patternTypeName
name of the pattern to select
"word"
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}
options
(optional) a list of options, each one of {"content" => feature has its own position, "unique" => feature gets a position, but sequences of "unique" features without "content" features in between are mapped to one position, "pred" => the position is bound to the preceding feature, "succ" => the position is bound to the succeeding feature}
"content"
"unique"
"succ"
"pred"

:defineMetaDataFromPatternMatch

Define a result of pattern matching to insert as metadata, normalized

Parameter

fieldname
field name of the meta data element to produce.
"location"
patternTypeName
name of the pattern to select
"word"
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}

:defineAttributeFromPatternMatch

Define a result of pattern matching to insert as document attribute, normalized

Parameter

attribname
name of the document attribute to produce.
"annotation"
patternTypeName
name of the pattern to select
"word"
normalizers
list of normalizer function descriptions to use for this feature in the ascending order of appearance
"uc"
{"lc", {"convdia", "en"}}
{"date2int", "d", "%Y-%m-%d"}

:definePatternMatcherPostProc

Declare a pattern matcher on the document features after other query analysis

Parameter

patternTypeName
name of the type to assign to the pattern matching results
"location"
patternMatcherModule
module id of pattern matcher to use (empty string for default)
""
lexems
list of all lexems generated by the feeder (analyzer)
"word"
{"word", "number"}
patterns
structure with all patterns

:definePatternMatcherPostProcFromFile

Declare a pattern matcher on the document features after other query analysis

Parameter

patternTypeName
name of the type to assign to the pattern matching results
"location"
patternMatcherModule
module id of pattern matcher to use (empty string for default)
""
serializedPatternFile
path to file with serialized (binary) patterns
"/srv/strus/patterns.bin"

:defineDocument

Declare a sub document for the handling of multipart documents in an analyzed content, or of documents of different types with one configuration

Parameter

subDocumentTypeName
type name assigned to this sub document
"employee"
selectexpr
an expression that defines the content of the sub document declared
"/doc/employee"

Notes

Sub documents are defined as the sections selected by the expression, plus some selected data not belonging to any sub document.
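
A sketch, assuming analyzer is a strus_DocumentAnalyzer instance and a document layout where each employee record is an <employee> section:

```lua
-- declare every /doc/employee section as its own sub document of type 'employee'
analyzer:defineDocument("employee", "/doc/employee")
```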

:analyzeSingle

Analyze a content and return the analyzed document structure (analyzing a single document)

Parameter

content
content string (NOT a file name !) of the document to analyze
"<?xml version='1.0' encoding='UTF-8' standalone=yes?><doc>...</doc>"
documentClass
(optional) document class of the document to analyze, if not specified the document class is guessed from the content with document class detection
{mimetype="application/xml", encoding="UTF-8", scheme="customer"}
{mimetype="application/json", encoding="UTF-8"}

:analyzeMultiPart

Analyze a content and return the analyzed document structures as an iterator (analyzing a multipart document)

Parameter

content
content string (NOT a file name !) with the documents to analyze
"<?xml version='1.0' encoding='UTF-8' standalone=yes?><doc>...</doc>"
documentClass
(optional) document class of the document set to analyze, if not specified the document class is guessed from the content with document class detection
{mimetype="application/xml", encoding="UTF-8", scheme="customer"}
{mimetype="application/json", encoding="UTF-8"}

Remarks

The only way to construct a query analyzer instance is to call createQueryAnalyzer of the Context interface

Notes

If you are not sure whether to use analyzeSingle or analyzeMultiPart, use analyzeMultiPart: it covers analyzeSingle by returning an iterator on a set containing only the single document

:addElement

Define an element (term, metadata) of query analysis.

Parameter

featureType
element feature type created from this field type
fieldType
name of the field type defined
tokenizer
tokenizer function description to use for the features of this field type
normalizers
list of normalizer function descriptions to use for the features of this field type in the ascending order of appearance

:addElementFromPatternMatch

Define an element from a pattern matching result.

Parameter

type
element type created from this pattern match result type
patternTypeName
name of the pattern match result item
normalizers
list of normalizer functions

:addPatternLexem

Declare an element to be used as a lexem by post-processing pattern matching, but not put into the result of query analysis

Parameter

termtype
term type name of the lexem to be fed to the pattern matching
fieldtype
type of the field of this element in the query
tokenizer
tokenizer function description to use for the features of this field type
normalizers
list of normalizer function descriptions to use for the features of this field type in the ascending order of appearance

:definePatternMatcherPostProc

Declare a pattern matcher on the query features after other query analysis

Parameter

patternTypeName
name of the type to assign to the pattern matching results
patternMatcherModule
module id of pattern matcher to use (empty string for default)
lexems
list of all lexems generated by the feeder (analyzer)
patterns
structure with all patterns

:definePatternMatcherPostProcFromFile

Declare a pattern matcher on the query features after other query analysis

Parameter

patternTypeName
name of the type to assign to the pattern matching results
patternMatcherModule
module id of pattern matcher to use (empty string for default)
serializedPatternFile
path to file with serialized (binary) patterns

:defineImplicitGroupBy

Declare an implicit grouping operation for a query field type. The implicit grouping operation is always applied when more than one term results from the analysis of this field, to ensure that the field yields only one node in the query.

Parameter

fieldtype
name of the field type where this grouping operation applies
opname
query operator name generated as node for grouping
range
positional range attribute for the node used for grouping
cardinality
cardinality attribute for the node used for grouping
groupBy
kind of selection of the grouped arguments ("position": elements with the same position get their own group; "all" (or "", the default): all elements of the field get into one group)
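
A sketch, assuming qryanalyzer is a strus_QueryAnalyzer instance; the operator name "contains" is an example, and range 0 with cardinality 0 are assumed to mean "no constraint":

```lua
-- group all terms resulting from the analysis of a 'text' field
-- into one 'contains' node
qryanalyzer:defineImplicitGroupBy("text", "contains", 0, 0, "all")
```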

:analyzeTermExpression

Analyze the term expression and return the result structure

Parameter

expression
query term expression tree

:analyzeMetaDataExpression

Analyze the metadata expression and return the result structure

Parameter

expression
query metadata expression tree

:addTerm

Declare a term that is used in the query evaluation as a structural element without being part of the query (for example punctuation used for match fields summarization)

Parameter

set
identifier of the term set that is used to address the terms
type
feature type of the term
value
feature value of the term

:addSelectionFeature

Declare a feature set to be used as selecting feature

Parameter

set
identifier of the term set addressing the terms to use for selection

:addRestrictionFeature

Declare a feature set to be used as restriction

Parameter

set
identifier of the term set addressing the terms to use as restriction

:addExclusionFeature

Declare a feature set to be used as exclusion

Parameter

set
identifier of the term set addressing the terms to use as exclusion

:addSummarizer

Declare a summarizer

Parameter

name
the name of the summarizer to add
parameter
the parameters of the summarizer to add (parameter name 'debug' reserved for declaring the debug info attribute)
resultnames
(optional) the mapping of result names

:addWeightingFunction

Add a weighting function to use as summand of the total document weight

Parameter

name
the name of the weighting function to add
parameter
the parameters of the weighting function to add
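
A sketch, assuming queryEval is a strus_QueryEval instance and that the 'bm25' weighting function of the standard strus modules is available; the parameter names shown follow that function, and any feature-set parameters are omitted:

```lua
-- use BM25 as a summand of the total document weight
queryEval:addWeightingFunction("bm25", {k1=1.2, b=0.75, avgdoclen=500})
```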

:defineWeightingFormula

Define the weighting formula to use for calculating the total weight from the weighting function results (sum of the weighting function results is the default)

Parameter

source
the source of the weighting formula
defaultParameter
(optional) default parameter values

:createQuery

Create a query to instantiate based on this query evaluation scheme

Parameter

storage
storage to execute the query on

:addFeature

Create a feature from the query expression passed

Parameter

set
name of the feature set, this feature is addressed with
expr
query expression that defines the postings of the feature and the variables attached
weight
(optional) individual weight of the feature in the query

Remarks

expr
The query expression passed as parameter is refused if it does not contain exactly one element

:addMetaDataRestriction

Define a meta data restriction

Parameter

expression
meta data expression tree interpreted as CNF (conjunctive normal form, an "AND" of "OR"s)

Notes

expression
leafs of the expression tree are 3-tuples of the form {operator,name,operand} with operator: one of "=","!=",">=","<=","<",">"; name: the name of the meta data element; operand: the numeric value to compare with the meta data field (right side of the comparison operator). If the tree has depth 1 (a single node), it is interpreted as a single condition. If the tree has depth 2 (a list of nodes), it is interpreted as an intersection "AND" of its leafs. An "OR" of conditions without "AND" is therefore expressed as a list of lists of structures, e.g. '[[["<=","date","1.1.1970"], [">","weight",1.0]]]' <=> 'date <= "1.1.1970" OR weight > 1.0' and '[["<=","date","1.1.1970"], [">","weight",1.0]]' <=> 'date <= "1.1.1970" AND weight > 1.0'
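
The convention above written as Lua structures, assuming query is a strus_Query instance:

```lua
-- date <= "1.1.1970" AND weight > 1.0  (depth 2: "AND" of the leaf conditions)
query:addMetaDataRestriction({{"<=", "date", "1.1.1970"}, {">", "weight", 1.0}})

-- date <= "1.1.1970" OR weight > 1.0   (extra nesting: a list of "AND" lists)
query:addMetaDataRestriction({{{"<=", "date", "1.1.1970"}, {">", "weight", 1.0}}})
```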

:defineTermStatistics

Define term statistics to use for a term for weighting it in this query

Parameter

type
query term type name
value
query term value
stats
the structure with the statistics to set
{df=74653}

Examples

defineTermStatistics("word", "game", {df=74653})

:defineGlobalStatistics

Define the global statistics to use for weighting in this query

Parameter

stats
the structure with the statistics to set
{nofdocs=1234331}

:addDocumentEvaluationSet

Define a set of documents the query is evaluated on. By default the query is evaluated on all documents in the storage

Parameter

docnolist
list of documents to evaluate the query on (list of integers)

:setMaxNofRanks

Set number of ranks to evaluate starting with the first rank (the maximum size of the result rank list)

Parameter

maxNofRanks
maximum number of results to return by this query

:setMinRank

Set the index of the first rank to be returned

Parameter

minRank
index of the first rank to be returned by this query
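
setMinRank and setMaxNofRanks together implement result paging; a sketch assuming query is a strus_Query instance:

```lua
-- fetch the second page of 20 results (ranks 20..39)
query:setMaxNofRanks(20)
query:setMinRank(20)
```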

:addAccess

Allow read access to documents having a specific ACL tag

Parameter

userlist
an ACL tag or a list of ACL tags that select documents to be candidates for the result

Notes

If no ACL tags are specified, then all documents are potential candidates for the result

:setWeightingVariables

Assign values to variables of the weighting formula

Parameter

parameter
parameter values (map of variable name to double)

:setDebugMode

Switch on debug mode that creates debug info of query evaluation methods and summarization as attributes of the query result

Parameter

debug
true if switched on, false if switched off (default off)

Notes

Debug attributes are specified in the declaration of summarizers and weighting functions (3rd parameter of QueryEval::addSummarizer and QueryEval::addWeightingFunction)

:evaluate

Evaluate this query and return the result

Parameter

no parameters defined
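
A sketch of evaluating a query and inspecting the result; the shape of the result structure (a list of ranks carrying a document number and a weight) is an assumption to be checked against your strus version:

```lua
local result = query:evaluate()
for _, rank in ipairs(result.ranks) do
    print(rank.docno, rank.weight)
end
```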

:tostring

Map the contents of the query to a readable string

Parameter

no parameters defined
generated by papugaDoc (Strus 0.14.0)