While provisioning Database-as-a-Service (DBaaS) on an Oracle Cloud Infrastructure (OCI) platform, a common error that prevents provisioners from proceeding is a failure to connect to Oracle Cloud Storage.  Here is what usually happens:

  • In the provisioning wizard, the user provides values such as the storage container details, credentials, etc.
  • Clicking Next then results in the following error: “Could not connect to Oracle Storage Cloud Service using the given username and password”.  Hence, the user is unable to proceed.

Solution

Although the username and password are correct, the error occurs repeatedly for more than one user in the group.  The issue is that these users are not part of the ‘Administrators’ group of the Storage/Object Container; hence, access is restricted.

Follow these steps to add the user to the Administrators group for the Storage Container (a scripted alternative is sketched after these steps):

  • Access the OCI Console.  Select the Menu bar > Identity > Users.
  • Choose the specific user and select ‘View User Details’.
  • Choose ‘Groups’ to view the Group Details. Here, you can see that the User is not a Member of any groups.
  • Select ‘Add User to Group’, then choose ‘Administrators’ from the drop-down menu and select ‘Add’.
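If you manage identities programmatically, the same change can be scripted. Below is a minimal sketch using the OCI Python SDK; the user name, group name, and config profile are placeholders, and it assumes the tenancy uses OCI IAM groups as described above.

import oci

# Load the default profile from ~/.oci/config
config = oci.config.from_file()
identity = oci.identity.IdentityClient(config)
tenancy_id = config["tenancy"]

# Look up the target user and the Administrators group by name (placeholder names)
user = next(u for u in identity.list_users(tenancy_id).data if u.name == "provisioning.user@example.com")
group = next(g for g in identity.list_groups(tenancy_id).data if g.name == "Administrators")

# Add the user to the group so the storage/object container becomes accessible
identity.add_user_to_group(oci.identity.models.AddUserToGroupDetails(user_id=user.id, group_id=group.id))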

Note: The storage container password is a pre-generated authorization token on the OCI console.  It is only displayed once – at the time of creation.
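If you prefer not to copy the token from the console, a token can also be generated programmatically. The following is a hedged sketch using the OCI Python SDK; the user OCID and description are placeholders. The token value is returned only in the create response, so capture and store it immediately.

import oci

config = oci.config.from_file()
identity = oci.identity.IdentityClient(config)

details = oci.identity.models.CreateAuthTokenDetails(description="DBaaS storage container token")
token = identity.create_auth_token(details, user_id="ocid1.user.oc1..exampleuniqueid").data

# The token string is available only in this response; store it securely
print(token.token)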

To watch a video with instructions on the creation of DBaaS using OCI, click here.

As always, if you have any questions related to this post, please leave us a comment below and one of our experts will get back to you.

Thanks for visiting!


When more than one Salesforce connection is used in the same integration flow, Oracle Integration Cloud (OIC) will throw the following integration activation error:

oracle.xml.parser.schema.XSDException: Can not build schema ‘urn:enterprise.soap.sforce.com’ located at ‘urn:enterprise.soap.sforce.com.__OAUX_GENXSD_.TOP.XSD’ [Cause=Can not build schema ‘urn:enterprise.soap.sforce.com’ located at ‘urn:enterprise.soap.sforce.com.__OAUX_GENXSD_.TOP.XSD’]”

Use Case

When an Incident is created in Oracle Service Cloud, depending on the Service Category selected, the Incident is sent to one of the two Salesforce destination systems.

Integration Flow:

Below is the OIC integration flow for the above use case, which uses two SFDC (invoke) connections.

Integration Activation Error:

When the OIC interface with two SFDC connections is activated, the following error is received:

Error in the log file:

The process domain is encountering the following errors while loading the process “DEMO_TWO_TARGET_SFDC_ERROR” (composite “default/DEMO_TWO_TARGET_SFDC_ERROR!01.00.0000*soa_3fc16407-f5b1-4c6e-9fb7-b28be9d4758d”): BPEL 1.1 compilation failed: DEMO_TWO_TARGET_SFDC_ERROR.bpel(line 149): part “parameters” of variable “ics_api_internal_variable2” is defined as XML element “{http://xmlns.oracle.com/cloud/adapter/salesforce/CreateServiceRequest_REQUEST}create” whose definition cannot be resolved because “oracle.xml.parser.schema.XSDException: Can not build schema ‘urn:enterprise.soap.sforce.com’ located at ‘urn:enterprise.soap.sforce.com.__OAUX_GENXSD_.TOP.XSD’ [Cause=Can not build schema ‘urn:enterprise.soap.sforce.com’ located at ‘urn:enterprise.soap.sforce.com.__OAUX_GENXSD_.TOP.XSD’]”.

Solution:

To resolve this error, split the integration into two flows – a main flow and a sub flow.

  1. The main flow receives the Incident record from Oracle Service Cloud.
  2. If Service = “Transportation”, the Incident data is mapped to the Service Request object of Salesforce 1, and the Salesforce web service is invoked using the Salesforce 1 SOAP connection to create a Service Request in Salesforce 1.
  3. Otherwise, the data is mapped to the input parameter of the sub flow, and the sub flow is invoked.
  4. The sub flow invokes the Salesforce web service using the Salesforce 2 SOAP connection to create a Hotline Concern in Salesforce 2.

As always, if you have any questions, please leave us a comment here and one of our experts will get back to you.

Thanks for stopping by!


Oracle Mobile Cloud Enterprise is a unique industry offering in that it combines both cross platform mobile development and chatbots into a single comprehensive offering.  Developers can create a chatbot in as little as a few hours and integrate it with various messaging platforms or mobile apps.

Our team recently attended and successfully completed Oracle’s Online Training on Chatbots, learning the ropes of how to build a chatbot and how to best model use cases for maximum customer engagement.  It was exciting to explore the use cases and we hope to showcase some of these in the near future.  In the meantime, you can view this excellent video series on how to model and build your own chatbots!

AST’s Integration team members who attended the Oracle Chatbot training earned completion badges after successfully finishing the training exam.  Congratulations to Sanjay B., Ankit C., Ankesh A., Jignesh P., and Shirish K.!


If you’re lost on where to start with chatbots or wondering how they can help you better engage with customers, let us know!  We can help you!


Recently, we discovered a version mismatch issue in which two SOAP services on different versions (Service-A on SOAP 1.2 and Service-B on SOAP 1.1) were failing to communicate with each other. Here, we explain how to resolve this issue.

Difference between SOAP 1.1 and SOAP 1.2

Before diving into the solution, let’s look at the improvements SOAP Version 1.2 provides over SOAP 1.1.

  1. Offers a clear processing model;
  2. Provides improved interoperability with testing and implementation requirements;
  3. Based on the XML Information Set (i.e., it is specified as an Infoset that is carried from one SOAP node to another, whereas SOAP 1.1 was based on XML 1.0 serialization);
  4. Offers protocol independence for developers by providing a binding framework;
  5. Includes HTTP binding for improved integration to the World Wide Web;
  6. Delivers a well-defined extensibility model; and
  7. Provides improved support for Web standards.

WSDL changes observed in SOAP 1.2

  1. Namespace changes: SOAP 1.2 supports the following namespace definition:
xmlns:soap12="http://www.w3.org/2003/05/soap-envelope"

  2. SOAP 1.2 uses “application/soap+xml” as the Content-Type, whereas SOAP 1.1 uses “text/xml” (see the sketch after this list).

  3. SOAP:Operation and SOAP Binding must be specified in a SOAP 1.2 WSDL.
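The Content-Type and envelope-namespace differences are easiest to see at the wire level. The following is a minimal sketch (not tied to any specific service) that posts the same body as SOAP 1.1 and as SOAP 1.2 using Python's requests library; the endpoint URL and action name are placeholders.

import requests

BODY = "<ns:echo xmlns:ns='http://example.com/demo'><ns:msg>hello</ns:msg></ns:echo>"

def soap11_request(url):
    # SOAP 1.1: text/xml plus a separate SOAPAction header
    envelope = ("<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'>"
                "<soap:Body>%s</soap:Body></soap:Envelope>" % BODY)
    headers = {"Content-Type": "text/xml; charset=utf-8", "SOAPAction": '"echo"'}
    return requests.post(url, data=envelope, headers=headers)

def soap12_request(url):
    # SOAP 1.2: application/soap+xml; the action travels as a media-type parameter
    envelope = ("<soap12:Envelope xmlns:soap12='http://www.w3.org/2003/05/soap-envelope'>"
                "<soap12:Body>%s</soap12:Body></soap12:Envelope>" % BODY)
    headers = {"Content-Type": 'application/soap+xml; charset=utf-8; action="echo"'}
    return requests.post(url, data=envelope, headers=headers)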

Solution

Now, to overcome the versioning mismatch issue mentioned above, we must follow these steps:

  1. Generate the OSB Proxy as a message-based proxy service, based on an XSD that contains only the body element with the parameters required to call Service-A (SOAP 1.2) and Service-B (SOAP 1.1).
  2. Create a Pipeline Service based on the same methodology explained in number 1, above.
  3. In the Pipeline Service, navigate to Message Flow and add a Pipeline Pair – rename it per the process standards.
  4. In the Request Pipeline node, add a Stage and rename it per process standards.
  5. Inside the Stage, add a Service Callout, then browse for the Proxy Service for the wrapper of Service-A (or the business service of Service-A). Configure the Service Callout with the required message parameters for Service-A assigned.
  6. After the above Pipeline Pair is complete, add a Route Node.
  7. Inside the Route Node, add a Routing Operation and configure the same for the Business Service of Service-B.
  8. Inside the Request Actions, assign or replace the Body and Header to make a successful call to the Business Service.

As always, if you have any questions regarding this issue or solution, leave us a comment below or contact us at info@astcorporation.com and our team will get back to you.


A familiar constraint we encounter when using Oracle Service Bus (OSB) Business Services is that they must be configured with a Username Token policy in which the password is not sent as plain text; instead, it is sent as a Password Digest (a Base64-encoded SHA-1 hash) together with a Nonce and Creation Time, for enhanced security.
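For reference, the digest the policy must produce follows the WS-Security UsernameToken Profile formula: PasswordDigest = Base64( SHA-1( nonce + created + password ) ). The sketch below computes it in plain Python so you can verify what a conforming client sends; the password value is a placeholder.

import base64
import hashlib
import os
from datetime import datetime, timezone

def username_token_digest(password):
    nonce = os.urandom(16)                                # raw nonce bytes
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = hashlib.sha1(nonce + created.encode() + password.encode()).digest()
    return {
        "Nonce": base64.b64encode(nonce).decode(),        # transmitted Base64-encoded
        "Created": created,
        "PasswordDigest": base64.b64encode(digest).decode(),
    }

print(username_token_digest("welcome1"))                  # placeholder password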

This type of security policy is not provided out of the box by Oracle Web Services Manager (OWSM).

This post will explain how to create the correct policy in a few quick steps.

To overcome the above security configuration issue, you will need to:

  1. Configure the WebLogic Server Security Realm providers to enable Password Digest as an active type.
  2. Configure a custom OWSM Security Policy to accept the password type as Password Digest.

Below are the detailed steps for the two high-level processes outlined above.

  1. Configure the WebLogic Server Security Realm providers to enable Password Digest as an active type.
    a. Log in to the WebLogic Console and go to Summary of Security Realms > myrealm > Providers.
    b. Create a new provider (e.g. Custom GESB Provider) and select Type: DefaultAuthenticator.
    c. Open the newly created provider, go to the Configuration > Provider Specific tab, select the Enable Password Digest checkbox, and save the changes.
    d. Go to DefaultIdentityAsserter and select wsse:PasswordDigest in Active Types.
    e. Restart the WebLogic Server.
  2. Configure a custom OWSM Security Policy for Password Digest.
    a. Log in to the EM Console and go to WebLogic Domain > Web Services > WSM Policies.
    b. Click “oracle/wss_username_token_client_policy”, then click the Create Like button.
    c. In the General tab, provide the name of the policy, leaving the other settings as is.
    d. In the Assertions tab, select Digest as the Password Type, select the Nonce Required and Creation Time Required check boxes, and click Save.
  3. The policy you configured should now be ready.
  4. Restart the server.

The newly created OWSM policy can now be attached to your chosen process to provide CSF Key credentials. The policy will automatically create binary passwords and other required parameters.


As part of best practices, a deployer responsible for deploying composites should not need to have the roles and privileges of an administrator; instead, they should be limited to deploying composites.

Selecting the proper roles and privileges to grant deployment rights to this user is slightly confusing and involves changes to the user’s Oracle WebLogic Server enterprise role, as well as their Oracle SOA Suite application role. The following security exception will occur if and when the account used for deployments through JDeveloper lacks the appropriate roles and privileges to complete a deployment:

“Error finding SOA configured servers to deploy archive.
Deployment cannot continue.
Java.lang.SecurityException: MBean attribute access denied.
     MBean: EMDomain:Name=soa-infra,
EMTargetType=oracle_soainfra,type=EMIntegration,
Application=soa-infra
     Getter for attribute Server
     Detail: Access denied. Required roles: Admin, Operator, Monitor, Deployer, executing subject:   principals[testuser]”


This error occurs because there is no default mapping of roles between Oracle WebLogic Server groups or users and Oracle Enterprise Manager Fusion Middleware Control.

Both the Oracle WebLogic Server enterprise role (for example, Oracle WebLogic Server Monitor) and the Oracle SOA Suite application role (for example, SOAMonitor) are required to use Oracle Enterprise Manager Fusion Middleware Control. If you have only one of these roles, Oracle Enterprise Manager Fusion Middleware Control does not work properly.

Solution:

The fix is simple and requires assigning the required role in WebLogic Security Realm:

  1. Log in to the WebLogic Console as weblogic or as any user with administrative privileges.
  2. Click “Security Realms” and select “myrealm”.
  3. Select the “Users and Groups” tab.
  4. Select the user that requires access and navigate to the “Groups” tab.
  5. Assign the user to the “Operators” group.

Assigning the required role in Enterprise Manager:

  1. Log in to the Enterprise Manager (Fusion Middleware Control) console as weblogic or as any user with administrative privileges.
  2. Right-click soa-infra and select “Security” -> “Application Roles”.
  3. Add the same user to the “SOAOperator” role.
  4. Navigate to Application Policies (right-click soa-infra -> “Security” -> “Application Policies”) and assign “oracle.fabric.permission.CompositePermission” to the “SOAOperator” role. (Both role grants can also be scripted; see the WLST sketch below.)
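Both role grants can also be applied from the command line. Below is a hedged WLST sketch (run with wlst.sh from ORACLE_HOME/oracle_common/common/bin); the connection details are placeholders, and the user name testuser is taken from the error above, so substitute your deployer account.

connect('weblogic', 'welcome1', 't3://adminhost:7001')

# 1) WebLogic enterprise role: add the deployer to the Operators group
serverConfig()
authenticator = cmo.getSecurityConfiguration().getDefaultRealm().lookupAuthenticationProvider('DefaultAuthenticator')
authenticator.addMemberToGroup('Operators', 'testuser')

# 2) SOA application role: grant SOAOperator to the same user
grantAppRole(appStripe='soa-infra', appRoleName='SOAOperator',
             principalClass='weblogic.security.principal.WLSUserImpl',
             principalName='testuser')

disconnect()

The CompositePermission grant from step 4 is easiest to perform through the Enterprise Manager console as described above.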

The user will now have the necessary privileges to deploy composites through JDeveloper.


What are CSF Key Credentials?

A credential store is a repository of security data (credentials). A credential can hold username and password combinations, tickets, or public key certificates.

The Credential Store Framework (CSF) provides a set of built-in APIs that applications can use to securely create, read, update, and manage credentials.

CSF Uses: 

The credential store is mainly used to store the credentials (username and password) to access the service and the applications.

Use Case Scenario: 

We had a requirement to configure the SOA CSF Key Credentials programmatically using an automated process.

Solution

The credential store configuration can be accomplished using a WLST command. In addition, ANT scripts are used for automation.

Step 1: Open a Windows Command Prompt or Linux/Unix Shell Terminal to start the WebLogic Server Administration Scripting Shell utility. Enter the following, depending on the system.

(Windows Command Prompt)

C:\Users\<<username>> cd <<ORACLE_HOME>>\wlserver\common\bin

C:\<<ORACLE HOME>>\wlserver\common\bin> wlst

(Unix/Linux Shell Terminal)

[oracle@myhost ]$ cd <<ORACLE_HOME>>/wlserver/common/bin

[oracle@myhost bin]$ ./wlst

Step 2: At the WLST utility prompt, connect to the Admin Server.

wlst:/offline> connect('weblogic','welcome1','t3://localhost:7001')

Step 3: Once the user is successfully connected to the Admin Server, the following commands can be executed; a concrete example follows the command reference below.

(For CSF Key Creation)

createCred(map=<<keyMapName>>,key=<<keyName>>,user=<<keyUser>>,password=<<keyPass>>,desc=<<keyDesc>>)

(For CSF Key Update)

updateCred(map=<<keyMapName>>,key=<<keyName>>,user=<<keyUser>>,password=<<keyPass>>,desc=<<keyDesc>>)

(For CSF Key Deletion)

deleteCred(map=<<keyMapName>>,key=<<keyName>>)
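For example, a typical invocation with concrete (hypothetical) values looks like the following; oracle.wsm.security is the map commonly used for OWSM CSF keys, but your map name may differ.

createCred(map='oracle.wsm.security', key='my-soa-csf-key', user='integration_user', password='Welcome1', desc='CSF key for SOA invocations')

# Verify the entry was stored
listCred(map='oracle.wsm.security', key='my-soa-csf-key')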

Automating CSF Key Credentials at Deployment Time

ANT scripts can be used to automate CSF Key Credential configuration at deployment time.

<target name="createCSFKeyCred">
  <wlst debug="false" arguments="${admin.username} ${admin.password} ${admin.server} ${map} ${keyCredentialsName} ${user} ${password} ${desc}">
    <script>
import sys

# Positional arguments passed in from the ANT target above
adminUser=sys.argv[0]
adminPassword=sys.argv[1]
adminUrl=sys.argv[2]
keyMap=sys.argv[3]
keyName=sys.argv[4]
keyUser=sys.argv[5]
keyPass=sys.argv[6]
keyDesc=sys.argv[7]

print('Connecting to WLST Server')
connect(adminUser,adminPassword,adminUrl)
print('Creating Security Credentials')
createCred(map=keyMap,key=keyName,user=keyUser,password=keyPass,desc=keyDesc)
disconnect()
print('Disconnected')
    </script>
  </wlst>
</target>

References

https://docs.oracle.com/cd/E12839_01/core.1111/e10043/csfadmin.htm#CACGIGDB 


Oracle Fusion Middleware 12c introduces a wealth of new features and capabilities, and continues to surpass expectations in the realm of performance and scalability as the leading strategic Event Stream Processing Platform from Oracle.

Recently, we worked on showcasing the capabilities of Oracle Fusion Middleware to process and analyze a large volume of high velocity data generated by Internet of Things (IoT) devices in real time.  This use case involved measuring the temperature of shipping containers during transit, and reporting on the status in real time.  Since the container contents were temperature sensitive, the temperature had to be maintained within given upper and lower values.  The ideal solution provides real-time alerts and intelligence allowing the user to flag the shipping carrier and make business decisions (such as either recalling or taking corrective action on the shipment) during transit.

As part of the demo, we built an IoT device using off-the-shelf components. This device contained temperature, humidity, and location sensors, and was housed in a custom 3D-printed enclosure. In a typical case, there would be multiple sensors deployed across several locations, monitoring and measuring various factors as needed and relaying this information in real time to the backend system for processing and analysis. To receive and process this data from the IoT device in real time, we built a solution using Oracle Complex Event Processing (Stream Analytics) and Business Activity Monitoring, and presented the results as a dashboard.

The following videos provide a high level overview of the use case followed by the actual demo.  In the coming weeks, we will guide you on how to build your very own IoT showcase using the Oracle SOA Suite platform.  Stay tuned!

IoT Introduction and Business Use Case

IoT Demo


The GTC (Generic Technology Connector) is used to build connectors for target systems, such as flat-file imports via FTP or SPML-based provisioning over Web Services. It can be used to integrate with OIM target systems that do not need complicated provisioning process flows.

The GTC can be created in OIM using the web-based point-and-click graphical wizard, which clearly shows the user the data flows being defined within the connector. This reduces deployment timelines.

In this post, we’ll demonstrate how to resolve a GTC issue while configuring a CSV file for use in reconciliation in OIM11gR2 PS3.

Issue: You may experience the following issue when saving a Flat File Trusted Generic Technology Connector (GTC) that was created by following the standard steps in OIM.


The corresponding error appears in the OIM diagnostic log:

<Nov 3, 2016 3:09:20 PM IST> <Error> <XELLERATE.WEBAPP> <BEA-000000> <Class/Method: CreateGenConnectorAction/createGenericConnectorSuccess encounter some problems: java.lang.NullPointerException
oracle.iam.platform.utils.ServiceInitializationException: java.lang.NullPointerException at oracle.iam.platform.Platform.getService(Platform.java:277)
<Nov 3, 2016 3:09:20 PM IST> <Error> <XELLERATE.DATABASE> <BEA-000000> <Class/Method: DirectDB/getConnection encounter some problems: Error while retrieving database connection.Please check for the following

 Database server is running.

 Datasource configuration settings are correct.

java.sql.SQLException: java.sql.SQLException: Exception occurred while getting connection: oracle.ucp.UniversalConnectionPoolException: Cannot get Connection from Datasource: java.sql.SQLRecoverableException: IO Error: Invalid connection string format, a valid format is: “host:port:sid” at com.thortech.xl.util.DirectDB$DBPoolManager.getConnection(DirectDB.java:441)

Solution: Correct the maxConnections and url parameter values in the OIM MDS file /db/oim-config.xml, as shown below:

<directDBConfigParams checkoutTimeout="1200" connectionFactoryClassName="oracle.jdbc.pool.OracleDataSource" connectionPoolName="OIM_JDBC_UCP" driver="oracle.jdbc.OracleDriver" idleTimeout="360" maxCheckout="1000" maxConnections="5" minConnections="2" passwordKey="OIMSchemaPassword" sslEnabled="false" url="jdbc:oracle:thin:@[Host DB IP]:1521/orcl" username="DEV_OIM" validateConnectionOnBorrow="true">

Existing Value:

MaxConnections = 5
Url = jdbc:oracle:thin:@[Host DB IP]:1521/orcl

Sample Expected Value:

MaxConnections = 25
Url = jdbc:oracle:thin:@<OIM_DB_HOST_IP>:<OIM_DB_PORT>/<OIM_DB_SID>

Note: The URL value above should contain the correct host, port, and SID of the OIM database.

There are two approaches to making the above changes:

Approach 1: Use the standard EM console.

Approach 2: Use the weblogicExportMetadata.sh and weblogicImportMetadata.sh standard OOB OIM utilities.

We’ll go into more detail below.

APPROACH 1 – Using the standard EM console

Using the EM console should be the preferred approach, since it reduces the possibility of making errors in the configuration files. The existing values should be verified and saved as a backup before being changed, using the steps below.

Step 1: Log in to the OIM EM console at http://<EM_HostIP>:<EM_HostPort>/em

Step 2: Click Identity and Access > OIM > oim(11.1.2.0.0)

Step 3: Click the Oracle Identity Manager drop-down and select System MBean Browser

Step 4: Click Application Defined MBeans > oracle.iam

Step 5: Click XMLConfig > Config

Step 6: Click XMLConfig.DirectDBConfig > DirectDB

Make the changes below and click Apply.

MaxConnections = 25
Url = jdbc:oracle:thin:@<OIM_DB_HOST_IP>:<OIM_DB_PORT>/<OIM_DB_SID>

Step 7: Verify the changes.

APPROACH 2 – Using the weblogicExportMetadata.sh and weblogicImportMetadata.sh standard OOB OIM utilities

If you wish to get your hands dirty and perform the changes manually using the command line, here are the steps. The EM console is still recommended for such changes in order to avoid unnecessary edits. Always make a backup copy of the files you'll be editing before making any changes. Use the weblogicExportMetadata.sh and weblogicImportMetadata.sh standard OOB OIM utilities available under $MW_HOME/Oracle_IDM/server/bin.

Please note that the oim-config.xml file will be exported under the standard /db folder (under the sample /tmp/export_04112016 directory). In the steps below, MW_HOME refers to the Middleware installation folder and OIM_ORACLE_HOME refers to the OIM installation folder.

Step 1: Check the parameter values in the weblogic.properties file under /app/oracle/middleware/Oracle_IDM/server/bin/

# WebLogic server name on which the OIM application is running
wls_servername=oim_server1
application_name=OIMMetadata

# Folder location from which the updated /db/oim-config.xml will be imported
metadata_from_loc=/tmp/import_04112016

# Folder location to which the existing /db/oim-config.xml will be exported
metadata_to_loc=/tmp/export_04112016

metadata_files=/db/oim-config.xml

Step 2: Go to $MW_HOME/Oracle_IDM/server/bin and run ./weblogicExportMetadata.sh

[oracle@myhost bin]$ export OIM_ORACLE_HOME=/app/oracle/middleware/Oracle_IDM

[oracle@myhost bin]$ ./weblogicExportMetadata.sh

Once the export completes successfully, the exported file is located at /tmp/export_04112016/db/oim-config.xml.

Step 3: Modify the oim-config.xml file as suggested under the Solution section above, and place it under /tmp/import_04112016/db (a small helper script that performs this edit is sketched below).
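If you want to script this edit, the following is a hedged helper that patches the two attributes in the exported file and writes the result to the import location; the paths and the JDBC URL are the sample values used in this post, so adjust them for your environment.

import os
import re

src = "/tmp/export_04112016/db/oim-config.xml"
dst = "/tmp/import_04112016/db/oim-config.xml"
new_url = "jdbc:oracle:thin:@oimdbhost:1521/orcl"   # placeholder host, port, and SID

text = open(src).read()
text = re.sub(r'maxConnections="\d+"', 'maxConnections="25"', text, count=1)
text = re.sub(r'url="jdbc:oracle:thin:[^"]*"', 'url="%s"' % new_url, text, count=1)

os.makedirs(os.path.dirname(dst), exist_ok=True)
open(dst, "w").write(text)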

Step 4: Go to /app/oracle/middleware/Oracle_IDM/server/bin and run ./weblogicImportMetadata.sh

[oracle@myhost bin]$ export OIM_ORACLE_HOME=/app/oracle/middleware/Oracle_IDM

[oracle@myhost bin]$ ./weblogicImportMetadata.sh

Step 5: Verify the changes using Step 2.

Now, create the GTC again using the standard steps. If the issue persists, restart the Admin and OIM managed servers after clearing the temp cache directory.

As always, if you have any questions regarding this process or our provided solution, please do not hesitate to post a comment below, and our team will get back to you.


We recently implemented a complete Oracle Fusion Middleware 12c (12.2.1.1) stack for a client based on the Oracle SOA Enterprise Deployment Guide (EDG). The design focus was a highly available and scalable clustered environment containing OSB, SOA, MFT, OWSM, ESS, and BAM managed servers, plus a highly available Admin Server. Each of the managed servers had its own dedicated VM, with an active-passive Admin Server cluster. We performed extensive tuning and load testing to make sure the system could function under load.

However, as we migrated to higher environments, the Deployments screen would take a long time to render, even though all the deployed applications functioned as expected. This sometimes resulted in timeouts and caused application deployments to fail. The overall performance of both the WebLogic Console and Enterprise Manager was sluggish, particularly on the Deployments screen, with waits of over 30 minutes to render a single click action. If some of the servers were shut down, the load time improved slightly, but keeping servers shut down was not a viable option. The log files appeared normal and contained no error messages.

Working with Oracle Support, we noticed in some of the thread dumps and Java Flight Recordings that a few of the managed beans took too long to respond; this was later identified as a known defect. To confirm that this was the issue, we disabled the new thread self-tuning functionality added in this release of WebLogic Server by adding the following JVM start-up parameter to all the WebLogic Servers and restarting them.

-Dweblogic.UseEnhancedIncrementAdvisor=false   
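One way to apply the flag across the domain is shown in the hedged WLST sketch below; it assumes the servers are started through Node Manager (so that ServerStart arguments are honored), and the connection details are placeholders. If you start servers with scripts instead, append the flag to JAVA_OPTIONS in setUserOverrides.sh or setDomainEnv.sh.

FLAG = '-Dweblogic.UseEnhancedIncrementAdvisor=false'

connect('weblogic', 'welcome1', 't3://adminhost:7001')
edit()
startEdit()
for server in cmo.getServers():
    cd('/Servers/%s/ServerStart/%s' % (server.getName(), server.getName()))
    args = get('Arguments') or ''
    if FLAG not in args.split():
        set('Arguments', (args + ' ' + FLAG).strip())
save()
activate()
disconnect()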

After the restart, the WebLogic Console was significantly more responsive and the Deployments screen loaded in a few seconds, a great improvement over the 30-minute waits before the thread-tuning setting was applied.

This verified that the performance issue was due to the newly added functionality, and the temporary workaround was to disable it. At the time of writing this post, Oracle has released a patch that fixes the issue. The high-level steps are as follows:

  1. Remove the parameter -Dweblogic.UseEnhancedIncrementAdvisor=false from all WebLogic Server startup arguments.
  2. Apply Patch 23762529 for WLS 12.2.1.1.0 to all the servers in the domain.
  3. Restart all the servers and test by logging in to the console and clicking the Deployments tab.
  4. If it works, apply Patch 24901211 for WLS 12.2.1 to all servers in the domain.
  5. Restart all the servers and test again.

This should resolve the performance issues caused by the new thread self-tuning functionality.

Tagged with: , , ,