Channel: SCN : Discussion List - SAP HANA and In-Memory Computing

SAP HANA Studio


Hello Guys,

 

I am starting with SAP HANA development.

 

Where can I download SAP HANA Studio?

 

Thanks.


Copy system schema content to another schema


Hi Team,

 

We have a requirement to export the full SYSTEM schema (including tables, indexes, functions, etc.) and import it into another schema, say SYSTEM1, so that any change to a table inside the SYSTEM1 schema does not affect the SYSTEM schema.

What would be the shortest and simplest method for doing this?

 

Your suggestions would be really appreciated.
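In case it helps others: one possible approach is the EXPORT/IMPORT statement pair with schema renaming on import. A minimal sketch, assuming a server-side directory /tmp/system_copy exists and the executing user has the EXPORT and IMPORT privileges (path and thread count are illustrative):

```sql
-- Export all objects of the SYSTEM schema; binary format preserves
-- table definitions, indexes, procedures/functions and data
EXPORT "SYSTEM"."*" AS BINARY INTO '/tmp/system_copy' WITH REPLACE THREADS 4;

-- Import the same objects, renaming the schema on the way in
IMPORT "SYSTEM"."*" AS BINARY FROM '/tmp/system_copy'
  WITH RENAME SCHEMA "SYSTEM" TO "SYSTEM1" THREADS 4;
```

Note that SYSTEM is a special schema containing internal objects, so in practice it is usually better to keep application objects out of it entirely rather than to clone it.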

 

Regards

Rableen

DMO showed incorrectly


Hello,

 

I'm migrating to HANA with DMO and I'm getting this screen during the process.

 

111.JPG

 

I can't select HDB as the target database.

 

The screen should look like this:

 

112.JPG

Any suggestions about this behaviour?

 

Regards

How to get filters and prompts to universe from cal view


Hi All,

 

 

I recently created some calculation views. When I create a universe against a calculation view, the filters and prompts are not carried over to the universe.

Please advise how to get filters and prompts into the universe for WebI.

SQLscript procedure - Performance optimisation


Hello Experts,


Continuing from the below discussion :

SQLscript procedure - Avoid materialisation of Internal tables

 

and thanks to valuable inputs from Lars Breddemann I was able to zero in on the problem area.

 

What I learnt in the process (screenshot from PlanViz attached) is that two statements take up 99% of the time.

I have pasted the two statements below for reference. (I did my best to format the SQL for readability, since copy-pasting SQL statements is not always the best approach.)

 

Also, these two statements form the heart of the functional logic. They are part of a revenue allocation routine, where revenue (or cost) is allocated from a sender to a receiver based on characteristics of the sender and receiver (the join condition).


The distribution base is used to create the multiplying factor, and the allocation is done using S."AMT" * (R."DIST_BASE" / D."DIST_BASE") (a key figure in the receiver is defined as receiver_distbase).

 

I'm not a functional expert in this topic; I'm sharing what I understood from the functional expert.

 

Without going too deep into the functional side, I wanted to understand what is wrong with the SQL below.

I can already see that the CASE statement is a row operation and will slow down the process drastically, and that the IN in the join condition is also a performance glitch. I understood from the functional expert that this logic is necessary.

 

These two statements take around 20 seconds each, which I feel is a lot considering the data volume.


Any pointers on how I should proceed?

 

Thanks in advance.

 

Best Regards,
Vinay

 

 

 

 

 

 

lt_sender2 - 65 rows.

lt_distbase  - 65 rows

lt_receiver - 136535 rows.

 

 

 

 

 

lt_distbase =
  SELECT ROW_NUMBER() OVER () AS "FS_PER_D_ROW_",
         1 AS "FS_PER_DUMMY_",
         S."YEAR_MM", S."DEPT_CD", S."DEPT_GRP", S."UPPER_DEPT_GRP",
         CASE
           WHEN SUM( IFNULL( R."DIST_BASE", 0 ) ) = 0 THEN TO_DOUBLE(1)
           ELSE SUM( R."DIST_BASE" )
         END AS "DIST_BASE"
    FROM :lt_sender2 AS S
    LEFT OUTER JOIN :lt_receiver AS R
      ON ( S."FS_PER_DUMMY_"   = R."FS_PER_DUMMY_" )
     AND ( S."YEAR_MM"        IN ( R."YEAR_MM",        R."FS_PER_YEAR_MM" ))
     AND ( S."DEPT_CD"        IN ( R."DEPT_CD",        R."FS_PER_DEPT_CD" ))
     AND ( S."DEPT_GRP"       IN ( R."DEPT_GRP",       R."FS_PER_DEPT_GRP" ))
     AND ( S."UPPER_DEPT_GRP" IN ( R."UPPER_DEPT_GRP", R."FS_PER_UPPER_DEPT_GRP" ))
   GROUP BY S."YEAR_MM", S."DEPT_CD", S."DEPT_GRP", S."UPPER_DEPT_GRP";

 

 

 

 

 

lt_allocation =
  SELECT S."FS_PER_S_ROW_", S."FS_PER_DUMMY_",
         IFNULL( R."FS_PER_R_ROW_", 0 ) AS "FS_PER_R_ROW_",
         IFNULL( D."FS_PER_D_ROW_", 0 ) AS "FS_PER_D_ROW_",
         IFNULL( R."YEAR_MM", S."YEAR_MM" ) AS "YEAR_MM",
         IFNULL( R."PLCY_NO", TO_NVARCHAR('') ) AS "PLCY_NO",
         IFNULL( R."CLERK_NO", TO_NVARCHAR('') ) AS "CLERK_NO",
         IFNULL( R."DIST_TYPE", TO_NVARCHAR('') ) AS "DIST_TYPE",
         IFNULL( R."COST_CENTER", TO_NVARCHAR('') ) AS "COST_CENTER",
         IFNULL( R."DEPT_CD", S."DEPT_CD" ) AS "DEPT_CD",
         IFNULL( R."OBJ_TYP", TO_NVARCHAR('') ) AS "OBJ_TYP",
         IFNULL( R."OBJ_SEQ", TO_NVARCHAR('') ) AS "OBJ_SEQ",
         IFNULL( R."PROD_CD", TO_NVARCHAR('') ) AS "PROD_CD",
         IFNULL( R."PROD_GRP", TO_NVARCHAR('') ) AS "PROD_GRP",
         IFNULL( R."MANDT", TO_NVARCHAR('') ) AS "MANDT",
         IFNULL( R."DEPT_GRP", S."DEPT_GRP" ) AS "DEPT_GRP",
         IFNULL( R."UPPER_DEPT_GRP", S."UPPER_DEPT_GRP" ) AS "UPPER_DEPT_GRP",
         IFNULL( ( S."AMT" * 100 / 100 * R."DIST_BASE" / D."DIST_BASE" ), S."AMT" ) AS "AMT"
    FROM :lt_sender2 AS S
    LEFT OUTER JOIN :lt_receiver AS R
      ON ( S."FS_PER_DUMMY_"   = R."FS_PER_DUMMY_" )
     AND ( S."YEAR_MM"        IN ( R."YEAR_MM",        R."FS_PER_YEAR_MM" ))
     AND ( S."DEPT_CD"        IN ( R."DEPT_CD",        R."FS_PER_DEPT_CD" ))
     AND ( S."DEPT_GRP"       IN ( R."DEPT_GRP",       R."FS_PER_DEPT_GRP" ))
     AND ( S."UPPER_DEPT_GRP" IN ( R."UPPER_DEPT_GRP", R."FS_PER_UPPER_DEPT_GRP" ))
    LEFT OUTER JOIN :lt_distbase AS D
      ON ( S."FS_PER_DUMMY_"   = D."FS_PER_DUMMY_" )
     AND ( S."YEAR_MM"         = D."YEAR_MM" )
     AND ( S."DEPT_CD"         = D."DEPT_CD" )
     AND ( S."DEPT_GRP"        = D."DEPT_GRP" )
     AND ( S."UPPER_DEPT_GRP"  = D."UPPER_DEPT_GRP" );
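One direction worth testing (a sketch only, not validated against the actual data): the IN (...) predicates keep the join from being a pure equi-join. A single IN predicate can be expanded into a UNION ALL of equality branches over the receiver. With four IN predicates the branch count multiplies, so this is only practical for the most selective column(s), and the semantics (a possible double match when both alternative values hold) must be verified against the functional logic. Column and variable names are taken from the statements above; lt_receiver_norm is hypothetical.

```sql
-- Expand the YEAR_MM alternatives into two equality branches,
-- so the join on this column becomes a plain equi-join
lt_receiver_norm =
  SELECT R.*, R."YEAR_MM" AS "JOIN_YEAR_MM"
    FROM :lt_receiver AS R
  UNION ALL
  SELECT R.*, R."FS_PER_YEAR_MM" AS "JOIN_YEAR_MM"
    FROM :lt_receiver AS R
   WHERE R."FS_PER_YEAR_MM" <> R."YEAR_MM";  -- avoid duplicate matches

-- The join predicate then changes from
--   S."YEAR_MM" IN ( R."YEAR_MM", R."FS_PER_YEAR_MM" )
-- to the equality
--   S."YEAR_MM" = RN."JOIN_YEAR_MM"
```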

Errors with HANA Hadoop Integration


Hi,


I'm trying HANA Hadoop integration using SAP HANA SPS09, Hadoop 2.7.1 and Hive 1.2.1.

I have followed the videos on YouTube (SAP HANA Academy - SDA : Hadoop Enhancements - 1. Creating User [SPS09] - YouTube) and also read the HANA documentation (Administration Guide) on this topic.


All I want to do is follow the video and run WordCount through the HANA Hadoop integration.

So I assigned all the necessary privileges to the user and created the MR job archives, the remote source and the virtual function as shown below.

I have created the MR jobs.

package.PNG

 

Then I ran the two SQL statements successfully.

 

create remote source hadoop_source
adapter "hadoop"
configuration 'webhdfs_url=http://myserver:50070;webhcat_url=http://myserver:50111'
with credential type 'PASSWORD'
using 'user=hduser;password=hdpass';


 

create virtual function HADOOP_WORD_COUNT()
RETURNS TABLE ("word" NVARCHAR(60), "count" INTEGER)
package DEV."dev.HanaShared::WordCount"
CONFIGURATION 'enable_remote_cache=true;mapred_jobchain=[{"mapred_input":"/data/mockingbird/",
"mapred_mapper":"sap.WordCountMapper",
"mapred_reducer":"sap.WordCountReducer"}]'
AT HADOOP_SOURCE;

 



Currently I'm hitting an issue at the last step, when I check the results:


select * from HADOOP_WORD_COUNT();

 

But then the console gives me these errors:


Could not execute 'select * from HADOOP_WORD_COUNT()' in 74 ms 384 µs .

SAP DBTech JDBC: [2048]: column store error: search table error:  [2620] executor: plan operation failed;

 

I checked my HANA trace file and found this:


pop = executorPy.ceCustomCppPop() # pop1

pop.setNodeName('$$_SYS_SS2_RETURN_VAR_$$HADOOP_WORD_COUNT')

pop.setUseInternalTable()

pop.addViewAttribute('word', datatype=83, intDigits=60, sqlType=37, sqlLength=60)

pop.addViewAttribute('count', datatype=73, sqlType=3)

pop.setLocale('BINARY')

pop.setUserSchema('DEV')

pop.addPlanDebugOpDataInfo(, scenarioName = 'DEV:_SYS_SS_CE_166142_139899986562304_3_TMP')

pop.addKeyValuePair( 'CONFIGURATION','enable_remote_cache=true;mapred_jobchain=[{"mapred_input":"/data/mockingbird","mapred_mapper":"sap.WordCountMapper","mapred_reducer":"sap.WordCountReducer"}]')

pop.addKeyValuePair( 'PACKAGE_NAME','dev.HanaShared09::WC')

pop.addKeyValuePair( 'PACKAGE_SCHEMA','DEV')

pop.addKeyValuePair( 'REMOTE','HADOOP_SOURCE')

pop.addKeyValuePair( 'RETURN_TYPE_INFO','[{"ftcType":37,"index":0,"length":60,"name":"word","scale":0},{"ftcType":3,"index":1,"length":0,"name":"count","scale":0}]')

pop.addKeyValuePair( 'VUDF_NAME','DEV_HADOOP_WORD_COUNT')

pop.setPopId(2251)

pop.addViewAttr('count')

pop.addViewAttr('word')

pop.setExecuteUser('DEV')

[16093]{328130}[29/-1] 2015-10-21 14:24:36.867483 e cePlanExec       cePlanExecutor.cpp(07145) : Error during Plan execution of model DEV:_SYS_SS_CE_166142_139899986562304_3_RET (-1), reason: executor: plan operation failed;

[16060]{-1}[-1/-1] 2015-10-21 14:26:17.056104 w Logger           SavepointImpl.cpp(02149) : NOTE: BACKUP DATA needed to ensure recoverability of the database

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373304 i TraceContext     TraceContext.cpp(00823) : UserName=SYSTEM, ApplicationUserName=xxxxxxxx, ApplicationName=HDBStudio, ApplicationSource=csns.admin.commands.AdministrationHandler$1$1.call(AdministrationHandler.java:338);csns.admin.commands.AdministrationHandler$1$1.call(AdministrationHandler.java:1);java.util.concurrent.FutureTask.run(FutureTask.java:266);java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142);java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617);java.lang.Thread.run(Thread.java:812);

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373288 e PlanViz          PlanVizAction.cc(00045) : PlanVizContext is NULL!!

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373324 e PlanViz          PlanVizAction.cc(00046) : Current session context: systemWatermark=30125,slaveInitCount=-1,version=5,contextId=104427,globalSessionId=328227,anchorGlobalSessionId=328227,version=0,user=SYSTEM,schema=SYSTEM,locale=en_US,collate=BINARY,client=,curr_id_val=-1,app=HDBStudio,app_user=xxxxxxx,dateformat=,reserveprefix=true,ddlautocommit=false,checkPasswordChangeNeeded=false,abapVarcharMode=false,largeNumberOfParametersSupport=false,isFederationCallbackSession=false,associatedConnectionId=0,totalRowCount=0,enableDeferredLobOperation=0,hasStatefulCtxBitmap=4,tmpTableCount=0,transactionIsolationLevel=1

[18169]{328227}[22/-1] 2015-10-21 14:26:35.373356 e PlanViz          PlanVizAction.cc(00047) : Stack trace:

1511995[thr=18169]: SqlExecutor at

1: 0x00007f4867900052 in Execution::ContextFunctions::dumpInfo(Execution::Context&, ltt::basic_ostream<char, ltt::char_traits<char> >&, bool, bool, bool, bool, bool)+0x390 at ContextFunctions.cpp:657 (libhdbbasis.so)

2: 0x00007f485bfd91aa in ptime::PlanVizActionParam::init(ptime::Env const&)+0x506 at PlanVizAction.cc:48 (libhdbrskernel.so)

3: 0x00007f485bf81ce2 in ptime::BuiltinProcedure_PLANVIZ_ACTION::execute(ptime::Env&) const+0x190 at PlanVizAction.h:14 (libhdbrskernel.so)

4: 0x00007f485b53f4f6 in ptime::Proc_call::execute(ptime::Env&) const+0x3a2 at qe_proc_call.cc:268 (libhdbrskernel.so)

5: 0x00007f485b54006c in ptime::Proc_call::operator()(ptime::Env&) const+0x728 at qe_proc_call.cc:141 (libhdbrskernel.so)

6: 0x00007f485be292ae in ptime::Query::_execute(ptime::Transaction&, char const*, ptime::Query::Plan*, ptime::Query::param_t*, ptime::Query::result_t*, bool)+0x5fa at query.cc:5249 (libhdbrskernel.so)

7: 0x00007f485be2f9db in ptime::Query::execute(ptime::Transaction&, char const*, ptime::Query::param_t*, ptime::Query::Plan*, ptime::Query::result_t*, ptime::Statement*, bool)+0x647 at query.cc:701 (libhdbrskernel.so)

8: 0x00007f485ab51b79 in ptime::Statement::execute_(Execution::Context&, bool, bool, bool, bool)+0x355 at Statement.cc:2054 (libhdbrskernel.so)

9: 0x00007f485ab7b03c in ptime::CallableStatement::execute(Execution::Context&, bool, bool, bool, bool, ptime::Statement::BatchProcessingState, bool, bool, bool)+0x588 at CallableStatement.cc:503 (libhdbrskernel.so)

10: 0x00007f4862863c8f in ptime::Session::executeQuery(Execution::Context&, ptime::Action&)+0xdb at sm_session.cc:1357 (libhdbsqlsession.so)

11: 0x00007f4862858ff0 in ptime::SessionHandler::handleEvent(Execution::Context&, ptime::AppEvent*)+0x4f0 at sm_handler.cc:846 (libhdbsqlsession.so)

12: 0x00007f486285a401 in ptime::SessionHandler::receiveMessage(Execution::Context&, ptime::CommEvent*)+0x960 at sm_handler.cc:647 (libhdbsqlsession.so)

13: 0x00007f486287907c in ptime::TcpReceiver::doWork(Execution::Context&, ptime::CommMgr*)+0xd78 at tcp_receiver.cc:505 (libhdbsqlsession.so)

14: 0x00007f4862879b0a in ptime::TcpReceiver::run(void*)+0x1d6 at tcp_receiver.cc:604 (libhdbsqlsession.so)

15: 0x00007f4875feb4d4 in TrexThreads::PoolThread::run()+0x810 at PoolThread.cpp:256 (libhdbbasement.so)

16: 0x00007f4875fecfb0 in TrexThreads::PoolThread::run(void*&)+0x10 at PoolThread.cpp:124 (libhdbbasement.so)

17: 0x00007f4867958439 in Execution::Thread::staticMainImp(void**)+0x875 at Thread.cpp:488 (libhdbbasis.so)

18: 0x00007f4867958ffd in Execution::Thread::staticMain(void*)+0x39 at ThreadMain.cpp:26 (libhdbbasis.so)


Judging from the logs, it seems the plan executor fails before the job can even be started.


By the way, I can successfully see controller.jar in Hadoop:

Hadoop Controller.PNG

I have also uploaded the JARs to the lib folder, following the Administration Guide.

Lib.PNG


I tried different HANA revisions, including SPS09 and SPS10, to perform the integration; all give me the same error.


So my question is: has anyone faced the same issue? Or is there anything wrong with my previous steps?

Thanks in advance.

Out of memory when executing Stored Procedure in SPS09


We have been facing a similar issue after an upgrade to the SPS 10.


Re: Out of Memory when executing Stored Procedures after SPS 09 upgrade


We had to downgrade again to SPS 09, but the issue was not resolved.

 

The problem manifests itself in different ways. The most recent occurrence appeared as follows: we have a SELECT statement that runs out of memory when executed inside a stored procedure, yet works fine when executed from the SQL console. The SELECT looks like:

 

SELECT [CALCULATION VIEW] WHERE [CLAUSE]

 

We have gone over this plenty of times but still don't understand what is going on. According to PlanViz, the engine suddenly starts performing outer joins for apparently no reason (find the .plv attached).


Moreover, if we remove the WHERE clause, the execution does not run out of memory.

 

We did open an incident, but we are struggling to get an answer.

Replication of Data from ECC/BW on to the Hana.


Hi All,

 

We are implementing the sidecar approach and have installed the components below.

 

1. HANA database installation is done.

2. ECC & BW systems are live.

3. SLT server installation is done.

 

We have applied the required add-ons on the ECC side and on SLT.

 

Can you please let us know the steps needed to replicate data from ECC/BW to HANA?

Do we need to run any jobs for this?

 

Please advise.

 

Regards,

Karthik.


HANA Views Metadata : Creation Date


Hi,

 

I am unable to find the creation date of HANA views. I looked at "SYS"."VIEWS" as well as the underlying base table "SYS"."RS_VIEWS_", but could not see any date field. Where do I find information on when a HANA view was created and last modified?

 

(In the case of HANA tables, the information is available in the "CREATE_TIME" column of view "SYS"."M_RS_TABLES", but there is no equivalent for views.)

 

Thanks.

ABAP Schema user is getting locked Automatically


#HANA Gurus



In HANA Studio, the ABAP schema user is getting locked automatically.

I want to find out how the user is getting locked: is there any trace at OS level where we can check this, or, on the HANA side, how can we find the user information?
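A few catalog queries that may help narrow this down (a sketch; 'SAPABAP1' is a placeholder for the actual ABAP schema user, and the INVALID_CONNECT_ATTEMPTS view is only available on sufficiently recent revisions):

```sql
-- Lock status and last failed logon of the user
SELECT USER_NAME, USER_DEACTIVATED, DEACTIVATION_TIME,
       LAST_SUCCESSFUL_CONNECT, LAST_INVALID_CONNECT_ATTEMPT
  FROM "SYS"."USERS"
 WHERE USER_NAME = 'SAPABAP1';

-- Failed logon attempts, useful for finding the client that is
-- still connecting with an old password and triggering the lock
SELECT * FROM "SYS"."INVALID_CONNECT_ATTEMPTS"
 WHERE USER_NAME = 'SAPABAP1';

-- Once the cause is fixed, clear the failed-attempt counter
ALTER USER SAPABAP1 RESET CONNECT ATTEMPTS;
```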


Thanks in advance


Regards,

Dhiraj

Error Installing HANA SPS 11


Hi All,

 

Has anyone successfully installed SPS11 using the downloads from the service market place?

 

I have downloaded the files three times, and each time I get CRC errors during the RAR extraction process. This seems to be leading to issues during my installation.

 

Any feedback would be greatly appreciated.

 

Thanks, Paul

 

Subsequent install issue:

 

Screen Shot 2015-11-26 at 9.24.41 AM.png

SCM migration with DMO


Hello,

 

I need to prepare an SCM migration to the HANA database. I've been looking for a guide like "First Steps to migrate BW to HANA" but for an SCM system, and I can't find anything.

 

Is there a similar guide for SCM migration?

 

Thanks a lot.

Change data source from bex query to HANA models for design studio dashboard


Hello,

 

In the present scenario, our Design Studio dashboard uses a BEx query as its data source.

 

There is a performance issue with the BEx query: it takes a long time to display output (BW on HANA).

 

Now we want to create HANA models on top of the InfoCubes or fact tables and connect those models to the existing dashboards as a data source.

 

Please advise on the following:

 

1. Can I create the view on the fact table or directly on the InfoCube (it is a flat-file data source, with no master data)?

2. Can we directly connect the model or universe to the existing dashboard (Design Studio)?

 

 

I am new to HANA and Design Studio.

Please advise.

 

Thank you

Inconsistent SSFS error after SP9 Upgrade


Hi Folks,

 

I am getting the error below after upgrading to SP9 Rev 97:

 

backup could not be completed, Inconsistent SSFS!

 

I tried to resolve this with the note below, but it didn't help. While executing the ALTER SYSTEM APPLICATION ENCRYPTION CREATE NEW KEY statement mentioned in the note, it throws the error "Application encryption not possible without secure store".

 

2097613 - Database is running with inconsistent Secure Storage File System (SSFS)

 

Need your help

 

-Avinash

How to declare a variable in sql script


Hi

 

How do I declare a variable with the same data type as a table's column?

 

For example, I have a table EMPLOYEE with a column SALARY, and I want to declare a variable salary_column with the same type as the SALARY column.

 

I have no idea what the data type of SALARY is.

Please suggest.
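As far as I know, SQLScript has no direct equivalent of ABAP's LIKE or PL/SQL's %TYPE. A common workaround is to look the type up in the catalog and then declare the variable explicitly. A sketch, assuming a table "EMPLOYEE" with column "SALARY" (the DECIMAL(15,2) type and the "EMPLOYEE_ID" filter are illustrative assumptions):

```sql
-- First, look up the column's actual data type in the catalog
SELECT DATA_TYPE_NAME, LENGTH, SCALE
  FROM "SYS"."TABLE_COLUMNS"
 WHERE TABLE_NAME = 'EMPLOYEE' AND COLUMN_NAME = 'SALARY';

-- Then declare the variable with that type spelled out explicitly
CREATE PROCEDURE read_salary( OUT ov_salary DECIMAL(15,2) ) AS
BEGIN
  DECLARE lv_salary DECIMAL(15,2);  -- assumed type; match the catalog result
  SELECT "SALARY" INTO lv_salary
    FROM "EMPLOYEE"
   WHERE "EMPLOYEE_ID" = 1;          -- hypothetical key column
  ov_salary := lv_salary;
END;
```

The downside is that the declaration must be maintained manually if the column's type ever changes.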


Mass Migration of User ID and ROles in HANA Studio from ECC

$
0
0

Hello Gurus,

 

A bit of background: I am developing SAP HANA views, which will be accessed using the HANA Live Browser or BOBJ.

 

The challenge is that the organization has thousands of users and roles. Is there any way to do a mass migration of them from ECC into SAP HANA Studio?

 

I need your urgent attention on this.

 

Regards,
Vivek

How can I create databasepool in a sap hana xs service?


Hi everybody, I'm not sure if this is the right forum, but I'm trying...

 

I'm new to the HANA XS service, and I'm trying to set up a database pool in an XS service. I've done everything listed in the SAP HANA Developer Guide SPS10, but I still seem to get one connection instead of several connections via the dbpool. Here is how I did it:

 

  1. created a .xssqlcc file
  2. added a description in the .xssqlcc file
  3. added a default role in the .xssqlcc file
  4. activated the .xssqlcc file in the repository
  5. activated the .xssqlcc file via the hana/xs/admin console
  6. added code in my XS service

             var connDBpool = $.hdb.getConnection({'sqlcc' : '<package:myxssqlcc file>','pool' : true});

    

But it still seems to be only one DB connection when the service runs several queries at the same time. Has anyone successfully created a dbpool? Have I missed something in the configuration of the dbpool?


Do I also need to add some properties to the .xsaccess file? Its content is currently:


{

     "prevent_xsrf": true,

     "authentication": { "method": "Basic"},

     "exposed": true

}


The reason I want a dbpool is that I find it very strange that the XS service needs to make a DB connection for each and every query to the database.

Passing parameters from graphical Calculation view to Calculation view based on script


Hi ,

 

I have read many forums where a graphical calculation view is called from a scripted calculation view and parameters are passed. In my POC I want to try the other way around: I want to use the scripted calculation view in the projection node of a graphical calculation view and get the desired output.

 

Can you please check the attached document and help me to resolve the issue?

Managed System Configuration for SAP HANA on Solution Manager 7.1


Hello, experts.

 

 

I have a problem setting up the Managed System Configuration for SAP HANA on Solution Manager 7.1.

 

 

The environment is:

SAP HANA SPS04 revision 28

Solution Manager 7.1 SR1

 

 

I have already done the two steps "Preparation of Solution Manager" and "Preparation on HANA" from

http://scn.sap.com/community/it-management/alm/solution-manager/blog/2012/03/30/configure-hana-db-with-sap-solution-manager-for-rca-tools-and-technical-monitoring

and at the next step I ran into some trouble.
 

 

 

 

Under Managed Systems Configuration:

Technical System -> Configure System -> step 5 "Enter Landscape Parameters" -> Landscape Objects shows

"Installed on path <Not Defined> (HANA Installed Technical System) (1)"

and the status is "No parameters" (see attached file 1).

 

 

 

 

Technical System -> Configure System -> step 9 "Check Configuration" -> Configuration Check shows the message

"Managed System ...~HANADB is not configured",

with "No Path found in the Installed Technical System" in the detail (see attached file 2).

 

 

---------------------

 

 

I have already checked the following URL, and the needed procedure is already done:

http://wiki.sdn.sap.com/wiki/display/SMSETUP/Maintenance+of+Product+in+the+System+Landscape#MaintenanceofProductintheSystemLandscape-SAPHANA

 

 

I also tried entering the path on the Technical System (see attached files 3 and 4), but nothing changed.

 

 

Regards.

 

Third side replication


I have a question.

 

Is it possible to replicate the HANA database to a second target?

 

Our business is planning a new datacenter design. A local HA solution is planned, but the datacenter for the DR solution is more than 100 km away. So we would like to set up a synchronous replication for the local HA solution and an asynchronous replication for the DR scenario.

 

Can anyone help?

 

Thanks

Michael
