Configuring Screener Environment

You need to set up the environment before using the Screener.

This section guides you through the steps required to install the Screener.

Installing Spectrum™ Technology Platform

Install Spectrum™ Technology Platform with these modules:

  • Spectrum Technology Platform Server (64-bit)
  • Business Steward Module
  • Data Integration Module
  • Data Hub Module
  • Spectrum Screener
  • Universal Addressing Module
  • Universal Name, Data Normalization, and Advanced Matching Modules
    Note: Deselect the Start the server after installation is complete check box that appears in the installation wizard.

    For more information on installing Spectrum™ Technology Platform, see the Installing a New Server section of the Installation Guide.

Follow these steps after successful installation of Spectrum™ Technology Platform:
  1. Navigate to <SpectrumLocation>\server\modules\fcc\model and extract the contents of the model.FCC_METADATA zip file to a folder with the same name.
    Note: Ensure that the extracted folder contains the same data as the zip file.
  2. Copy the model.FCC_METADATA folder to <SpectrumLocation>\server\modules\hub\db (a command sketch for steps 1 and 2 follows this list).
  3. Import the dataflows. For more information, see Importing Flows.
  4. Start the Spectrum™ Technology Platform server.
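
If you prefer to script steps 1 and 2, this is a minimal PowerShell sketch. It assumes a hypothetical installation location of C:\Spectrum and that the archive is named model.FCC_METADATA.zip; substitute your actual <SpectrumLocation> and file name.

    # Step 1: extract the model.FCC_METADATA archive into a folder of the same name
    # (C:\Spectrum and the .zip extension are assumptions; adjust to your installation)
    Expand-Archive -Path "C:\Spectrum\server\modules\fcc\model\model.FCC_METADATA.zip" -DestinationPath "C:\Spectrum\server\modules\fcc\model\model.FCC_METADATA"

    # Step 2: copy the extracted folder into the Data Hub database location
    Copy-Item -Recurse "C:\Spectrum\server\modules\fcc\model\model.FCC_METADATA" "C:\Spectrum\server\modules\hub\db\"

    # Verify that the extracted folder contains the same data as the archive before proceeding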
After the Spectrum™ Technology Platform server starts successfully, follow these steps to generate secure Entities of model.FCC_METADATA in Hub:
  1. To access the Spectrum Hub RAC Client, open a web browser and go to:

    http://Server:Port/hub

    Where server is the server name or IP address of your Spectrum™ Technology Platform server and port is the HTTP port. By default, it is set to 8080.

  2. Log in using admin as both the User name and Password.
  3. Click the Pitney Bowes logo at the top left corner of the screen and click Manage from the available menu options.

    The Model Management pop-up window appears; it displays details of FCC_METADATA under the Models tab.

  4. Copy the FCC_METADATA model using the Copy button and rename the copy to FCC_METADATA_BAK.
  5. Delete the FCC_METADATA model using the Remove button.
  6. Copy the FCC_METADATA_BAK model using the Copy button and rename the copy to FCC_METADATA.
  7. Delete the FCC_METADATA_BAK model using the Remove button.

Importing Base Tables

Run the Data Normalization Module database utility, select Advanced Transformer and enter this path for the source folder: <SpectrumLocation>/server/modules/fcc/FCC_Repo/setup/baseTables
Note: Repeat this step by selecting Open Parser and Table Lookup.

Importing CustomTable

Import CustomTables through the Spectrum™ Administration Utility. Use this command:
table import <SpectrumLocation>/server/modules/fcc/FCC_Repo/setup
Here, the table path covers all the tables present under these folders:
  • at
  • parser
  • lookup
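For illustration, assuming a hypothetical installation at C:/Spectrum and that the utility accepts one folder per import, the command could be run once for each folder; confirm the exact table import syntax in the Administration Guide.

    table import C:/Spectrum/server/modules/fcc/FCC_Repo/setup/at
    table import C:/Spectrum/server/modules/fcc/FCC_Repo/setup/parser
    table import C:/Spectrum/server/modules/fcc/FCC_Repo/setup/lookup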
For more information on the Administration Utility, see Getting Started with the Administration Utility in the Administration Guide.

Importing matchrule and openparser domain

Import matchrules and openparser domain through the Spectrum™ Administration Utility. Use these commands:

  • For matchrule:
matchrule import --f <SpectrumLocation>/server/modules/fcc/FCC_Repo/setup/matchRule
  • For openparser domain:
openparser domain import --f <SpectrumLocation>/server/modules/fcc/FCC_Repo/setup/domainOP

Importing Flows

Dataflows process data from one stage to the next. The output of one flow typically becomes the input of the next, though this is not always the case. There are also process flows that use main flows for processing.

  • FCC-Integrated-Flow: An end-to-end integrated flow. It cleanses and screens the data and creates an alert if any hit is found. This flow contains:
    • ProcFlow_PartyER_N_Screening
Process flows and their main data flows of various modules are summarized in these sections:
  • Party Management Data/Process: Run the FCC_ER_Party_PostProc1_CreatePartySearchIndex_v1 job before running any other flow. The input file for this job is DummyPartySearchIndex.txt and the outputs generated are:
    • To Index: ER_Party_SearchIndex
    • To File: SI_Update_All_Test.csv

Party Management flows:

Process Flow: FCC_ER_FullLoadInitial_ProcessFlow_V1. This flow takes party data as input and performs cleansing, normalization, and intraflow matching before producing the output. Its main flows are:
  • FCC_ER_Party_MainFlow1_Normalization_v1.df: For normalization of data.
    Input: A file with party data. Fields include:
      • inParty
      • inPartyAddress
      • inPartyPhone
      • inPartyEmail
      • inAccount
      • inPartyAccount
    Output: Party_MainFlow2_1_input.txt
  • FCC_ER_Party_MainFlow2_1_IntraflowMatch_v1.df: Finds matches between similar data records.
    Input: Party_MainFlow2_1_input.txt, the output of FCC_ER_Party_MainFlow1_Normalization_v1.df.
    Output: Party_MainFlow2_2_input.txt
  • FCC_ER_Party_MainFlow2_2_TransactionalMatchAndSurvivorship_v1.df: Finds interflow matches.
    Input: Party_MainFlow2_2_input.txt, the output of FCC_ER_Party_MainFlow2_1_IntraflowMatch_v1.df.
    Output: Party_MainFlow3_input.txt
  • FCC_ER_Party_PostProc1_UpdatePartySearchIndex_v1: Updates a party search index based on Party ID.
    Input: Party_MainFlow3_input.txt, the output of FCC_ER_Party_MainFlow2_2_TransactionalMatchAndSurvivorship_v1.df.
    Output: To Index: ER_Party_SearchIndex
Screening Data/Process Flows: The screening flows are summarized below:

Process Flow: Screening_Process. Performs screening and loads data into the hub. Its flows are:
  • Party_Screening: Performs party screening and stores the results in a file. It screens the parties against both cleansed and uncleansed List data.
    Input: Party_MainFlow3_input.txt, the output of FCC_ER_Party_MainFlow2_2_TransactionalMatchAndSurvivorship_v1.df.
    Output:
      • To File: Party_Cleansed_Hits.csv
      • To Uncleansed List: Party_UnCleansed_Hits.csv
  • Screening_Output: Combines the results of the cleansed and uncleansed matches and performs consolidation; that is, it creates an alert in the Hub model and consolidates all the matches into a single alert.
    Input: The cleansed data file Party_Cleansed_Hits.csv and the uncleansed data file Party_UnCleansed_Hits.csv, the outputs of Party_Screening.df.
    Output: Loads the data into the hub.
Note: Import all other flows placed at <SpectrumLocation>/server/modules/fcc/Dataflows.
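If you prefer to script the flow imports with the Administration Utility instead of using Enterprise Designer, a sketch could look like the line below. It assumes the utility offers a dataflow import command with a --f argument, analogous to the matchrule and openparser imports above, and that the .df files sit directly in the Dataflows folder; confirm the exact command in the Administration Guide.

    dataflow import --f <SpectrumLocation>/server/modules/fcc/Dataflows/FCC_ER_Party_MainFlow1_Normalization_v1.df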

Mapping DB for UAM and Loqate

Map the Universal Addressing Module and Loqate databases with the Screener. These databases validate and standardize the addresses. For information on Spectrum™ databases, see Introduction to Spectrum Databases in the Spectrum Databases section of the Administration Guide.

Add these as the names of the databases:
  • Loqate
  • LQT_EUROPE
  • LQT_EMEA
  • LQT_APAC
  • LQT_AMER
  • UAMUS
  • UAMCAN
  • UAMFRA

Placement of Data Files

Copy the FCC_Repo folder placed at <SpectrumLocation>/server/modules/fcc from the installation folder to the C: drive of your system. If you are not using the C: drive, change the paths of the input and output files for the flows listed in Screening Data/Process Flows.
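
As a minimal PowerShell sketch, assuming a hypothetical installation at C:\Spectrum, the copy amounts to:

    # Copy the FCC_Repo folder from the installation to the root of the C: drive
    Copy-Item -Recurse "C:\Spectrum\server\modules\fcc\FCC_Repo" "C:\"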

Sample input and output files are present in the FCC_Repo folder.

Configuring Negative Media Service

Importing the Negative Media Service
  1. Open the Management Console
  2. Click Resources
  3. Click External Web Services
  4. Click Import and select NegativeMediaNomino service
Authenticating Negative Media Service
  1. Generate an open token. For details on generating the open token, refer to the open token documentation.
  2. Edit the NegativeMediaNomino service
  3. Click Next twice to reach the Headers section
  4. Update the Authorization token with the token generated in step 1
  5. Click Save
    Note: Ensure that the NegativeMediaNomino service is enabled.

List Import utility – Upload and Polling

To import any List files from the Screener or to upload them directly to the polling or base directory, you need to configure and start the List Ingestion utility.

Configuring a Pre Process Flow

Create a pre process flow before uploading any List files from the vendor. If you do not create a pre process flow, you must ensure that the input files are in the standard canonical format that is fixed for list ingestion flows.
Each pre process flow corresponds to a unique combination of ListType and Vendor. Multiple files can be provided as input, but the output must be generated as a single file in the fixed canonical format. The number and order of the columns in the output file must match the input file of the list ingestion process flow.
Note: The output file must not contain ListID and ListEntryStatus fields.
If no pre process flow is configured for a given ListType and Vendor, the input file is treated as a fixed canonical format file and is fed directly to the List Ingestion process flow.
Note: After creating the pre process flow, configure it in hub using the SetFlowConfig flow.
All the fields of the Inspection Input tab, such as ListType, Vendor, Mappings, and FlowName, are mandatory.
The Mappings section displays a list of files and their Mode, Info, and Stage. The Label field is generated in the flow.
  • The Mapping Mode can be IN or OUT; specify IN to Read from File or OUT to Write to File.
  • The Info field specifies information about the file, such as address, e-mail, and name. This is helpful when uploading a file.
After the successful configuration of the pre process flow, run the List Ingestion Utility.

Running the List Ingestion Utility

This utility requires Spectrum™ Technology Platform libraries for execution.

To run the List Ingestion utility, follow these steps:
  1. Navigate to this path:

    <SpectrumLocation>\server\modules\fcc\ListIngestionUtiltity

  2. Execute this script in the command line (an example invocation follows the note below):

    java -cp "fcc.jar;C:/<Path of Spectrum Libraries>" com.pb.spectrum.fcc.job.Main -u <Spectrum Username> -p <Spectrum Password> -h <Spectrum Server> -s <Port> -x <Path of List Polling Directory>

    Note: <Path of List Polling Directory> must match the fcc.spectrum.list.job.poll.dir property in the properties file placed at modules/fcc/fcc.properties.
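For example, with hypothetical values (an admin user, a local server on HTTP port 8080, Spectrum libraries under C:/SpectrumLibraries, and C:/FCC_Repo/ListPolling as the polling directory), the invocation could look like this; adjust every value to your environment and to fcc.properties:

    java -cp "fcc.jar;C:/SpectrumLibraries/*" com.pb.spectrum.fcc.job.Main -u admin -p admin -h localhost -s 8080 -x C:/FCC_Repo/ListPolling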
The log file for this utility, fcc.log, is generated at <SpectrumLocation>\server\modules\fcc\ListIngestionUtiltity. The utility polls the base directory for any changes made by uploading List files or by copying them directly to the Import folder.

Any List file uploaded through the pre process flow will be placed in the base directory with this hierarchy:

BaseDir > ListType > Vendor > ListID > Import > InputFile
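
For example, assuming the hypothetical base directory C:\FCC_Repo\ListPolling used earlier, a list of ListType Sanctions from Vendor OFAC with ListID 101 would be picked up from:

    C:\FCC_Repo\ListPolling\Sanctions\OFAC\101\Import\List_in.csv

Here Sanctions, OFAC, and 101 are illustrative values only.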

Note: This execution can also be triggered directly by placing the input file in the Import directory.
After the utility completes, the List_in.csv file placed in the Import directory is moved to the NewListVersion/archive Listversion directory, irrespective of the status (passed or failed) of the job.