IBM Data and AI Ideas Portal for Customers


This portal is to open public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of Ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



All ideas

Showing 14640 ideas

Backup/restore using Google Cloud Storage

The possibility to define Google Cloud Storage as a target for backups (in a similar way to how an S3-compatible storage can be used today). At the moment this is not certified, but many customers use GCS as their standard storage for backups.
8 months ago in Cloud Pak for Data Platform 0 Planned for future release
199 votes
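As a hedged illustration only, not part of the idea above or of any certified Cloud Pak for Data procedure: GCS exposes an S3-compatible XML API at storage.googleapis.com, so tooling that already supports S3-compatible targets can sometimes be pointed at a GCS bucket using HMAC interoperability keys. The bucket name, object key, and credentials below are placeholders.

```python
# Sketch: reuse an S3-compatible client against GCS's interoperability endpoint.
# Credentials, bucket, and file names are placeholders, not from the idea text.
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",   # GCS S3-compatible endpoint
    aws_access_key_id="GOOG1E_EXAMPLE_HMAC_ID",       # GCS HMAC access ID
    aws_secret_access_key="EXAMPLE_HMAC_SECRET",      # GCS HMAC secret
)

# Upload a backup image to the GCS bucket exactly as to any S3 target.
gcs.upload_file(
    "backup_2024_06_01.tar.gz",
    "my-backup-bucket",
    "cp4d/backup_2024_06_01.tar.gz",
)
```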

Reduce Time Required for Online (Inplace) Reorg for huge tables

1) Inplace reorg takes more time for tables with huge data, say for example 1 TB. 2) To reduce this time we want to avoid a full table scan; hence we need to look at the portions of the data in a table where it is getting fragmented. 3) Henc...
over 4 years ago in Db2 / Tables/Index (Range MQ MDC etc.) 21 Not under consideration

Request is to make WLM effective only on the primary; on STANDBY it should not be in effect

Why is it useful? It helps us prevent users who have read-only access on the standbys from accidentally connecting to the primary and running rogue commands. Who would benefit from it? We as DBAs; this gives us confidence in how we can use the r...
about 1 month ago in Db2 / Workload Management 2 Submitted

Configure multi-user login for DVM JGATE admin site

Scenario: Currently the only way to access the DVM JGATE site (https://db2_subsystem:port/gateway) is to use the admin username and password established during the DVM installation. The data sources within the interface are managed by us, the DB2 ...
3 months ago in Data Virtualization Manager for z/OS 0 Future consideration

Data captured in long transactions with guaranteed delivery to destination

It is useful because sometimes a very long transaction passes through the Capture process but can't be delivered, causing a memory shortage at the destination. The benefit would be not losing the data transferred to the destination. One suggestion would be to...
5 months ago in Replication: Q-Replication & Availability / Q Replication for Z 2 Needs more information

Improve Space Search Performance for Nearly Full PBR

Currently, when there are many concurrent inserts on a PBR which is nearly full, CPU and elapsed time for inserts increases by many orders of magnitude, to the point where it can completely monopolize the CPU on an LPAR. In past discussions, I was...
27 days ago in Db2 for z/OS 0 Under review

Set the Content Expiry header in spreadsheet services

Asking users to clear their browser cache when a websheet is changed across a large company is difficult, as not all users know how to do it. To facilitate that, we can set the content expiry header to 8 hours, which means the user's browser will re...
about 1 month ago in Planning Analytics 0 Submitted
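A minimal sketch of the requested caching behaviour, assuming a generic HTTP service in front of the websheets; the Flask app, route, and render_websheet helper below are hypothetical and are not part of Planning Analytics or its spreadsheet services.

```python
from flask import Flask, make_response

app = Flask(__name__)
EIGHT_HOURS = 8 * 60 * 60  # 28800 seconds, the expiry window named in the idea

def render_websheet(name):
    # Placeholder for whatever the service does to produce websheet content.
    return f"<html><body>websheet {name}</body></html>"

@app.route("/websheet/<name>")
def websheet(name):
    resp = make_response(render_websheet(name))
    # Let browsers reuse the cached copy for up to 8 hours, then refetch,
    # so users no longer need to clear their cache by hand after a change.
    resp.headers["Cache-Control"] = f"max-age={EIGHT_HOURS}"
    return resp
```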

Marking a project as sensitive on Default Git integration

One big restriction of CP4D Git integration for KB Bank is that the customer needs to mark projects as Sensitive per their security policy, but this is not available with the Default Git integration. As the option is available with the Deprecated Git integration...
4 months ago in watsonx.ai 0 Under review

MFA for Datastage Connections

We are undergoing a security assessment and per IA-2(1) for privileged access and IA-2(2) for non-privileged users (see below link and attachment for official documentation) we need to implement Multifactor Authentication for Privileged or Non-Pri...
10 days ago in DataStage 0 Submitted

Audit Info: Capture failed login user id in Splunk logs.

In its current state, CP4D audit logs do not capture the user IDs of failed login attempts in the logs that are forwarded to Splunk. This puts us in violation of GIS audit policy. Additional info available in IBM Support Case # TS016073601.
10 days ago in Cloud Pak for Data / Cloud Pak for Data Platform 0 Submitted