Arturia Forums

DRUMS => Spark => Spark Technical Issues => Topic started by: cubaner on July 08, 2011, 03:56:23 pm

Title: Extreme high Asio load!
Post by: cubaner on July 08, 2011, 03:56:23 pm
Hi,

I'm using the demo under Cubase 6.02 with an RME 9632, XP SP3, 3 GB RAM, and a 2.8 GHz quad-core.

The CPU load is absolutely okay, but the ASIO load is a catastrophic failure. Strangely enough, the extreme ASIO load stays the same no matter which buffer size is set: high load (around 60%) whether 512 or 2048 is selected. It's a pity, because this way I can't use Spark with a host; it's only usable standalone. It can't stand comparison with Maschine from NI.
Is there an update that will change this unfortunate situation?
None of my other virtual instruments produce such a high ASIO load.

If the ASIO load becomes acceptable, I will buy it.
Btw.: can I use the controller unit to tweak my other instruments via MIDI learn, or is there a listing of all the controller numbers the hardware unit sends?
Can I also remote-control the transport functions of my host (Cubase 6.02)?

Thanks for your answers. You are on a good path: it sounds great and looks nice, except for the horrible ASIO load.

greetings

Title: Re: Extreme high Asio load!
Post by: cubaner on July 10, 2011, 10:08:40 am
Hi,

is there really no one out there who has had this problem with the high ASIO load?
I can't believe it. I need help, please, because I want to buy a beatbox and Spark was my first choice. But after using the demo I am afraid of the high ASIO load. It's not a problem with my system, because all of my other virtual instruments work fine here, and I use a lot of them.

Is this a well-known issue? Is the Arturia team working on this?

Kevin... no advice or answer?

Sorry for my English, it may be full of mistakes, but I do my best ;-)


Title: Re: Extreme high Asio load!
Post by: Kevin on July 11, 2011, 10:16:01 am
Hi Cubaner,
sorry for not answering earlier.
You can trust that we will do our best to solve this ASAP.
We are working on it but are still not able to reproduce it here.
It seems to happen only on very specific configurations.
Can you give me some details (screenshots)?
Does it happen with every Spark project?

Kevin