Channel: SCN : Discussion List - SAP HANA and In-Memory Computing

Authorization issue in activating Attribute view using Tables from different schema

Hi Experts,

 

I have created a schema 'XYZ' in HANA, and it contains some tables, e.g. MARA.

 

I then ran the two statements below to grant another user 'A' access to my schema 'XYZ':

 

GRANT SELECT ON SCHEMA XYZ TO A WITH GRANT OPTION;

GRANT EXECUTE ON SCHEMA XYZ TO A WITH GRANT OPTION;

 

However, user 'A' still gets an authorization error while creating attribute views on table MARA.
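
For reference, repository activation is performed by the technical user _SYS_REPO, so a grant along these lines (a sketch, assuming that is the missing piece here) is typically also required:

GRANT SELECT ON SCHEMA XYZ TO _SYS_REPO WITH GRANT OPTION;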

 

Please suggest.

 

thanks


HANA views connection to Tibco Spotfire

Hello,

 

I am working with Tibco Spotfire version 6.5. I can consume tables in the HANA system, but I am unable to consume HANA views. Can I connect to HANA views from the tool? If so, how? Thanks in advance!
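
For reference, activated HANA models are exposed as column views in the _SYS_BIC schema, so any client that can issue plain SQL can usually read them; a sketch (the package path and view name are placeholders):

SELECT * FROM "_SYS_BIC"."my.package/MY_VIEW";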

Looking for study material for SAP HANA - Implementation and Modelling (HA 300)

Dear All,

 

I am planning to take the SAP HANA - Implementation and Modeling course through SAP Bangalore on 8 Oct 2012. Before attending the classes, I want to go through the study material. It will help me with certification too.

 

Could you please forward me links or study material for the same?

 

email : v.b.divekar [at] gmail.com

 

Regards,

Vikram

How to retrieve the results of an EXECUTE IMMEDIATE query call

Hi all,

 

I am creating a procedure in HANA that receives a table name as an input parameter and selects some data from that table.

The only solution I found was to use EXECUTE IMMEDIATE and concatenate the table name into a query string.

 

Unfortunately, I have no idea how to get the result of the query. According to the documentation there is a "result iterator" that can run through the data, but there is no information on how the iterator is obtained or used.

 

Has anyone used EXECUTE IMMEDIATE, or found another solution, to retrieve data from a dynamically chosen table?
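
For illustration, a minimal sketch (the procedure and parameter names are hypothetical): in SQLScript the result set of a dynamic statement is returned to the caller rather than bound to a variable, so the simplest pattern is to let EXECUTE IMMEDIATE emit the result set directly:

CREATE PROCEDURE select_from_table( IN tab_name NVARCHAR(256) )
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER AS
BEGIN
  -- The result set of the dynamic query is returned to the caller.
  -- Note: concatenating identifiers is open to SQL injection if
  -- tab_name is not validated against the catalog first.
  EXECUTE IMMEDIATE 'SELECT * FROM "' || :tab_name || '"';
END;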

 

Thank you and best regards,

Fabian

Limitations in importing DSOs into HANA

I am on BW 7.4 SP5 with HANA SP7 and have noticed that only 'Standard' DSOs can be imported into HANA. 'Write Optimized' and 'Direct Update' DSOs are not available for selection in HANA Studio.

 

I have not found any documentation explaining this limitation, and the F1 help regarding DSOs only says:

 

  • DataStore
    • SID Generation must not be Never create SIDs

 

So I thought I should ask: is this a bug, a feature, or am I doing something wrong?

How to Create a Table with Merge and partitions in HANA

Hi,

 

What is the best way to create a table with MERGE, PARTITION, and UNLOAD PRIORITY options?

 

Can anybody please give me some examples?
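
A minimal sketch combining the three options on a column table (schema, table, and column names are hypothetical, and clause order can vary by revision, so please check it against the SQL reference for your system):

CREATE COLUMN TABLE "MYSCHEMA"."SALES" (
    "ID"         INTEGER NOT NULL,
    "SALES_DATE" DATE,
    "AMOUNT"     DECIMAL(15,2),
    PRIMARY KEY ("ID")
)
PARTITION BY HASH ("ID") PARTITIONS 4  -- spread rows over 4 hash partitions
AUTO MERGE                             -- let mergedog schedule delta merges
UNLOAD PRIORITY 5;                     -- medium priority for memory unload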

 

Regards,
Deva

Row tables Or Column Tables Or both?

I would like to know the basic approach for building a new model in HANA: row tables, column tables, or both?

 

Suppose my source system delivers Excel or .csv files, each containing different types of data (both dimensions and measures), and each source file has around 500 fields. The feed arrives every day, and I need to load every field into the database.

 

Of these 500 fields, assume 300 are dimensions and 200 are facts; for reporting purposes I need only 150 dimensions, 80 measures, and some calculated measures. (In future, I may need to consider more dimensions and facts.)

 

Now my question is: as the source feed comes every day and I need to load all the fields, should I create row tables first, since row tables are preferred for insert operations, or can I go ahead with column tables directly?

 

I just want to know the guidelines to follow when we need to load thousands of fields and a huge number of rows, while keeping the model good for reporting as well.

 

What I am not able to work out is this: SAP HANA recommends not combining row tables and column tables in one operation. So should I first load all the data into row tables and then fill reporting-specific column tables from the row tables' data? (I could create some stored procedures to load data from the row tables into the column tables and run them after the data load completes; see the sketch below.)
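
A minimal sketch of such a procedure (schema, table, and column names are hypothetical):

CREATE PROCEDURE load_reporting_tables( )
  LANGUAGE SQLSCRIPT AS
BEGIN
  -- Copy only the reporting-relevant fields from the row-store
  -- staging table into the column-store reporting table.
  INSERT INTO "MYSCHEMA"."SALES_REPORTING" ("V_DATE", "REGION", "AMOUNT")
    SELECT "V_DATE", "REGION", "AMOUNT"
      FROM "MYSCHEMA"."SALES_STAGING_ROW";
END;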

 

Please let me know your inputs.

 

 

Thanks,

Sree

HANA LIVE deployment error

Dear All,

 

We are facing an issue while deploying SAP HANA Live for ERP. Our HANA DB version is REV 61, and the LCM version is 1, patch 7, update 0 (1.0.7.0). We are implementing SAP HANA Live for ERP and SAP HANA Live for EHP4 for ERP.

 

When deploying SAP HANA Live for ERP, the LCM halts with the error message below.


Error Message:

Repository request failed, cause: Repository: Activation failed for at least one object; at least one error was encountered during activation. Please see the CheckResult information for detailed information about the root cause. All objects without errors have been activated., code: 40,136, argument: null. For more details see /usr/sap/hlm_bootstraps/H21/HLM/log/object_activation_result_01.log

 


Help: http://help.sap.com/hana

 

All the tables mentioned in notes 1782065 and 1781992 have been implemented; the tables are listed as created in the data provisioning screen.

 

Thanks,

Rajiv


Delta Merge Log

Hi Team,

 

 

I was trying to understand the delta merge topic. In the cost function calculation, I noticed a parameter:

DLS - Delta log size [MB]

 

What does this space contain?

 

I later noticed this term in many other parts of the documentation; can someone help me understand it?
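
For context, the delta merges that this cost function feeds into can be observed and triggered manually; a sketch with hypothetical schema and table names:

-- Trigger a delta merge for one table:
MERGE DELTA OF "MYSCHEMA"."MYTABLE";

-- Inspect merge runs, motivations, and durations:
SELECT * FROM "SYS"."M_DELTA_MERGE_STATISTICS"
 WHERE "TABLE_NAME" = 'MYTABLE';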

 

 

 

Thanks,

Razal

How to drop a procedure, if already exists

Hi,

 

I want to run an SQL script in HANA Studio that drops a procedure if it already exists; otherwise nothing should happen.
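
A minimal sketch, assuming a revision that supports anonymous blocks (on older revisions the same logic can be wrapped in a small helper procedure); the schema and procedure names are placeholders:

DO BEGIN
  DECLARE cnt INTEGER;
  -- Look the procedure up in the SYS.PROCEDURES catalog view.
  SELECT COUNT(*) INTO cnt
    FROM "SYS"."PROCEDURES"
   WHERE "SCHEMA_NAME" = 'MYSCHEMA' AND "PROCEDURE_NAME" = 'MY_PROC';
  IF :cnt > 0 THEN
    EXEC 'DROP PROCEDURE "MYSCHEMA"."MY_PROC"';
  END IF;
END;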

 

Thank you in advance.

D.E.

Stored Procedure in HANA Development perspective (XS Project)?

Hi,

 

How do I create a procedure for performing DML operations in the SAP HANA Development perspective (XS project)?

 

 

 

The default template uses "READS SQL DATA", which doesn't allow DML operations (insert/update/delete, dynamic SQL, etc.):

 

 

CREATE PROCEDURE EMPLOYEE_DETAILS ( )
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
  --DEFAULT SCHEMA <default_schema_name>
  READS SQL DATA
  AS
BEGIN
/*****************************
  Write your procedure logic
*****************************/
select * from employee;
END;
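
For comparison, a sketch of the same header without the READS SQL DATA clause; omitting it allows write operations (the employee table and salary column are hypothetical):

CREATE PROCEDURE EMPLOYEE_UPDATE ( )
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
  AS
BEGIN
  -- No READS SQL DATA clause, so DML is permitted here.
  UPDATE employee SET salary = salary * 1.10;
END;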

 

Could you please respond to this?

 

 

 

Thanks,

Sree

Error when loading Unit Conversion Tables

Hello.

 

I'm loading the unit conversion tables into HANA using Data Services, i.e. the following tables:

- T006

- T006D

- T006A

 

according to the following guide: Associate Measures with Unit of Measure - SAP HANA Modeling Guide - SAP Library

 

The problem I'm having is that DS is displaying the following error:

 

[screenshot: ScreenShot 09-03-14 at 10.38.24 AM.png]

 

It says that the value of the field MSH3 of table T006A is too large for the target table, which in HANA has the same definition. In the mapping the fields are identical, with the same data type and length. Is this some kind of bug, or is it something like lowercase letters not being allowed in HANA?

 

[screenshot: ScreenShot 09-03-14 at 10.43.09 AM.png]

Thanks for your help.

Passing Set of values to Calculation View on SQL Procedure

Hi All,

We have a scenario where we need to pass a set of values at a time. The current SQL statement is:

SELECT "V_DATE",
       SUM("ITEMS"),
       SUM("HOURS"),
       SUM("COUNTERS")
  FROM "_SYS_BIC"."*********" ( "$$DATEFROM$$" => :VAR_DATE )
 GROUP BY "V_DATE";

 

In the above statement we can pass only one value at a time, which is not useful for us. We have used the SELECT statement within a loop to pass all the desired values, but that has performance issues. The calculation view does one calculation based on DATEFROM (it is used to get the open items for a day), so we can't disturb that view.

 

Thanks in Advance.

Regards

Jagan

Viewing the SQL within the standard HANA procedures

Hi all,

 

I would like to see the SQL that is executed when I call a procedure such as CHECK_TABLE_CONSISTENCY.

Is that possible?

 

 

I tried Catalog > SYS > Procedures > right-click on the procedure > "Open Definition".
I can see there is a "create procedure" statement, but I do not see the statements that would be executed by this procedure.
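
For reference, procedure sources are also exposed through the SYS.PROCEDURES catalog view, though for built-in procedures implemented inside the server the DEFINITION column may be empty; a sketch:

SELECT "PROCEDURE_NAME", "DEFINITION"
  FROM "SYS"."PROCEDURES"
 WHERE "SCHEMA_NAME" = 'SYS'
   AND "PROCEDURE_NAME" = 'CHECK_TABLE_CONSISTENCY';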

 

Please advise.


Thanks,
Henry

Creating decision table in HANA studio

Hi Experts,

       Can anyone help me with the following:

          What is a decision table?

          What are the advantages of using a decision table?

          How do I create a decision table? Please also give me a sample scenario.

Regards,

Hityshi Gullipalli.


SAP SLT system restart

Hi,

 

If we have a couple of schemas running in SLT and we need to restart the SLT box, what prerequisites do we need to consider? Please provide the steps to follow before we take the SLT system down.

 

Regards,

Sohil Shah.

sFIN (simple Finance) and Smart Finance on HANA

Hello SAP HANA friends

 

I read this document about sFIN:

http://www.saphana.com/docs/DOC-4263

 

I have also seen documents about Smart Finance; are there any differences between sFIN and Smart Financials?

 

Thanks

Thomas

Master Data from BW -> HANA

Hi Experts,

 

Can you please suggest how attribute views can be created based on BW master data InfoObjects, for example 0Material?

 

As we know, views get created in HANA automatically once an InfoCube or ODS is activated in BW on HANA. Similarly, how can we create attribute views in HANA based on master data InfoObjects in BW?

 

Kindly suggest.

 

Thanks

Not able to do data preview in calculation view

Hi Experts,

 

I have created a Calculation view and the model is as follows:

[screenshot: sdn issue.jpg]

The filter expression in projection DTR is as follows:

 

[screenshot: sdn issue 2.jpg]

As you can see, I have two input parameters in my model, DATE_FROM and DATE_TO; based on the input from the end user, I am trying to restrict the data on 'INVOICE DATE' using these parameters.

 

Model description: Projection_1 and Projection_2 consist of one table each. These tables are joined in Join_1. The input parameters are defined in the projections, so they exist in all the objects above them in the model. Projection 'DTR' has a valid filter expression based on the input parameters, as stated above.

 

Issue: I am not able to do a data preview on this view; it throws the following error: [screenshot: sdn issue 3.jpg]

 

I am able to see data up to Join_1, but after that it always throws this error.
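
One way to narrow this down is to query the view directly in the SQL console, supplying the input parameters explicitly; a sketch with a hypothetical package path, column, and date values:

SELECT "INVOICE_DATE", COUNT(*)
  FROM "_SYS_BIC"."my.package/CALC_VIEW"
       ( "$$DATE_FROM$$" => '20140101',
         "$$DATE_TO$$"   => '20141231' )
 GROUP BY "INVOICE_DATE";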

 

Please suggest.

 

Your help would be highly appreciated.

 

Note: I have checked and gone through all the related posts on this, but the issue persists.

 

Thanks

Timestamp column drops milliseconds in XSJS query with resultSet.getMetaData?

We've found an interesting issue and wanted to bring it up here for review. When we call a query from an XSJS script and process the result set using the resultSet.getMetaData() method, there appears to be some locale conversion happening, and we lose the milliseconds from our timestamp column. We noticed this when using the resultSetToJSON function in the sap/hana/democontent/epmNext/services/session.xsjs file from the SHINE demo. We use the resultSetToJSON function in our application because it is generic.

 

Technical details: AWS developer image of Rev 80.

 

Sample code and results:

Table Definition:

// testTimestamp.hdbtable
table.schemaName  = "MEDPORTAL";
table.tableType   = COLUMNSTORE;
table.description = "Timestamp Test";
table.columns = [
  {name = "ID";                sqlType = INTEGER;   nullable = false; comment = "Unique ID "; },
  {name = "CURRENT_TIMESTAMP"; sqlType = TIMESTAMP; nullable = true;  comment = " ";          }
];
table.primaryKey.pkcolumns = ["ID"];

XSJS test code for testTimestamp.xsjs:

Note: we've added some debugging code, enabled by adding the parameter debug=true to the URL.

var bDebug        = false;
var oMethod       = $.request.method;
var oContent      = $.request.contentType;
var oParams       = $.request.parameters;
var oParamName    = '';
var oParamValue   = '';
var oParamDebug   = '';
var sHTML         = '';
var sBodyText     = '';
var oPayload      = null;
var oMessage      = 'Request Succeeded';
var oStatus       = $.net.http.OK;
var oResponse  = '';
// -------------------------------------------------------------------------- //
function addBodyDebugText( aText ) {
  // ------------------------------------------------------------------------ //
  if (bDebug) {
    sBodyText += "\n" + aText;
    //$.response.setBody( sBodyText );
  }
}
// -------------------------------------------------------------------------- //
function addBodyText( aText ) {
  // ------------------------------------------------------------------------ //
  sBodyText += "\n" + aText;
  //$.response.setBody( sBodyText );
}
// -------------------------------------------------------------------------- //
function escapeSpecialChars( aInput ) {
  // ------------------------------------------------------------------------ //
  var sOutput = '';
  if (aInput) {
    sOutput = aInput;
    // String.replace returns a new string, so assign the result back.
    sOutput = sOutput.replace(/[\\]/g, '\\\\');
    sOutput = sOutput.replace(/[\"]/g, '\\\"');
    sOutput = sOutput.replace(/[\/]/g, '\\/');
    sOutput = sOutput.replace(/[\b]/g, '\\b');
    sOutput = sOutput.replace(/[\f]/g, '\\f');
    sOutput = sOutput.replace(/[\n]/g, '\\n');
    sOutput = sOutput.replace(/[\r]/g, '\\r');
    sOutput = sOutput.replace(/[\t]/g, '\\t');
  }
  return sOutput;
}
// -------------------------------------------------------------------------- //
function resultSetToJSON( aResultSet, aResultSetName ) {
  // ------------------------------------------------------------------------ //
  var oMetadata;
  var iColumnCount;
  var aValues = [];
  var aTable  = [];
  var oValue  = "";
  var iColumn;
  var oTimestamp;
  var oTimestampStr;
  if (!aResultSetName) {
    aResultSetName = 'entries';
  }
  oMetadata    = aResultSet.getMetaData();
  iColumnCount = oMetadata.getColumnCount();
  while (aResultSet.next()) {
    for (iColumn = 1; iColumn <= iColumnCount; iColumn++) {
      oValue = '"' + oMetadata.getColumnLabel(iColumn) + '" : ';
      switch (oMetadata.getColumnType(iColumn)) {
        case $.db.types.VARCHAR:
        case $.db.types.CHAR:
          oValue += '"' + escapeSpecialChars(aResultSet.getString(iColumn)) + '"';
          break;
        case $.db.types.NVARCHAR:
        case $.db.types.NCHAR:
        case $.db.types.SHORTTEXT:
          oValue += '"' + escapeSpecialChars(aResultSet.getNString(iColumn)) + '"';
          break;
        case $.db.types.TINYINT:
        case $.db.types.SMALLINT:
        case $.db.types.INT:
        case $.db.types.BIGINT:
          oValue += aResultSet.getInteger(iColumn);
          break;
        case $.db.types.DOUBLE:
          oValue += aResultSet.getDouble(iColumn);
          break;
        case $.db.types.DECIMAL:
          oValue += aResultSet.getDecimal(iColumn);
          break;
        case $.db.types.REAL:
          oValue += aResultSet.getReal(iColumn);
          break;
        case $.db.types.NCLOB:
        case $.db.types.TEXT:
          oValue += '"' + escapeSpecialChars(aResultSet.getNClob(iColumn)) + '"';
          break;
        case $.db.types.CLOB:
          oValue += '"' + escapeSpecialChars(aResultSet.getClob(iColumn)) + '"';
          break;
        case $.db.types.BLOB:
          oValue += '"' + $.util.convert.encodeBase64(aResultSet.getBlob(iColumn)) + '"';
          break;
        case $.db.types.DATE:
          oValue += '"' + aResultSet.getDate(iColumn) + '"';
          break;
        case $.db.types.TIME:
          oValue += '"' + aResultSet.getTime(iColumn) + '"';
          break;
        case $.db.types.TIMESTAMP:
          // String concatenation invokes Date.prototype.toString(), which
          // formats in locale style and drops the milliseconds.
          oValue += '"' + aResultSet.getTimestamp(iColumn) + '"';
          break;
        case $.db.types.SECONDDATE:
          oValue += '"' + aResultSet.getSeconddate(iColumn) + '"';
          break;
        default:
          oValue += '"' + escapeSpecialChars(aResultSet.getString(iColumn)) + '"';
      }
      aValues.push(oValue);
    }
    aTable.push('{' + aValues + '}');
  }
  return( JSON.parse('{"' + aResultSetName + '" : [' + aTable + ']}') );
} // resultSetToJSON()
// -------------------------------------------------------------------------- //
function rowsToJSON( aResultSet ) {
  // ------------------------------------------------------------------------ //
  var aJsonRows = [];
  var oMetadata;
  var iColumnCount;
  oMetadata    = aResultSet.getMetaData();
  iColumnCount = oMetadata.getColumnCount();
  addBodyDebugText( "iColumnCount:  " + iColumnCount.toString() );
  while (aResultSet.next()) {
    // The Date object is passed through untouched; JSON.stringify later
    // serializes it via toISOString(), which preserves the milliseconds.
    aJsonRows.push({
      "ID":        aResultSet.getInteger(1),
      "Timestamp": aResultSet.getTimestamp(2)
    });
  }
  return aJsonRows;
}
// -------------------------------------------------------------------------- //
function testTimestamp() {
  // ------------------------------------------------------------------------ //
  var oConnection;
  var sQuery;
  var oStatement;
  var oResultSet;
  var oResultJson1;
  var oResultJson2;
  var oMessage;
  sQuery = 'SELECT * FROM "MEDPORTAL"."adsm.data::testTimestamp"';
  addBodyDebugText( "sQuery:  " + sQuery );
  try {
    oConnection  = $.db.getConnection();
    oStatement   = oConnection.prepareStatement( sQuery );
    oResultSet   = oStatement.executeQuery();
    oResultJson1 = resultSetToJSON( oResultSet );
    addBodyDebugText( "oResultJson1:  " + JSON.stringify( oResultJson1 ) );
    oResultSet   = oStatement.executeQuery();
    oResultJson2 = rowsToJSON( oResultSet );
    oResultJson2 = JSON.stringify( oResultJson2 );
    addBodyDebugText( "oResultJson2:  " + oResultJson2 );
    $.response.status = $.net.http.OK;
  } catch (oError) {
    $.trace.error( "DB exception, " + oError.toString() );
    addBodyText( oError.message );
    $.response.status = $.net.http.INTERNAL_SERVER_ERROR;
  } finally {
    if (oConnection) {
      oConnection.close();
    }
  }
}
//-------------------------------------------------------------------------- //
//--------------------------- SCRIPT ENTRY POINT --------------------------- //
//-------------------------------------------------------------------------- //
if ($.request.parameters.length > 0) {
  oParamDebug = $.request.parameters.get('debug');
}
if (oParamDebug === 'true') {
  bDebug = true;
}
// Log incoming request
$.trace.info( "testTimestamp.xsjs:  " + "Incoming Request" );
$.trace.info( "      Content Type:  " + oContent           );
$.trace.info( "       Method Type:  " + oMethod            );
$.trace.info( "        Parameters:  " + oParams            );
// Only process GET requests
if (oMethod === $.net.http.GET) {
  testTimestamp();
  // Return the response
  $.response.contentType = 'application/json; charset=UTF-8';
  sHTML = sBodyText;
  $.response.setBody( sHTML );
} else {
  // Method was not GET:  fail the request.
  oMessage          = 'This URI supports only GET requests';
  oStatus           = $.net.http.METHOD_NOT_ALLOWED;
  $.response.status = oStatus;
  addBodyText( oMessage );
} // if (Method === GET)

And finally the results in the browser with debug=true:

sQuery: SELECT * FROM "MEDPORTAL"."adsm.data::testTimestamp"
oResultJson1: {"entries":[{"ID":1,"CURRENT_TIMESTAMP":"Thu Sep 04 2014 09:35:52 GMT+0000 (UTC)"}]}
iColumnCount: 2
oResultJson2: [{"ID":1,"Timestamp":"2014-09-04T09:35:52.368Z"}]

Is this a known problem or feature that we missed in the documentation, or is this a bug?

 

Thank you,

 

Jim Giffin
