|
Post by xPsycHoWasPx on Mar 10, 2019 21:24:28 GMT
Hi everyone. For the last 3-4 months I've been trying to learn Python for MIDI Remote Scripts, and I've gotten really far now. Far enough that it's starting to become usable and actually pretty cool. My goal is to build the ultimate step sequencer device, at least for me. I just love programming drums with step sequencers, but it's often a so-so experience, both workflow-wise and when recalling the right MIDI patterns in Cubase / Ableton. Originally this was all based on Lemur's sequencer objects, but after learning how to program MIDI Remote Scripts, I can now just draw the patterns into Ableton from Lemur. I've even gotten it to use the browser and load samples onto drum pads, and to load devices/effects etc. in general.
Very soon I'm going to need some beta testers, to get a bit of feedback from others. A small teaser: a lot has been improved since these videos, and I will post some more up-to-date videos/material. So if anybody from this forum would be interested, please reply to this thread with an email I can send the Lemur/Live MIDI Remote Script to. I'm not interested in sharing this fully publicly, because I have a vision of maybe rebuilding it as a real app and selling it. I'm not trying to replace the Push controller or the Touchable app; this will be my own take on what a modern step sequencer could be like.
Because to me, step sequencers haven't evolved at all. The only kind of step seq there is has 16 steps and some switches to flip between steps 1-16 / 17-32 / 33-48 etc. And if a sequencer has more than one row of steps, it's almost always 4 x 8 (32 steps) or, like Push, 8 x 8 (64 steps). Mine will feature a layout of 3 x 64 steps (4 bars), where each 64-step section is a 4 x 16 module. Because of the 4 rows, you can draw all 4 bars at the same speed as 1 bar. It can also control velocity per step, and very soon it will control offset per step, so each step can be moved for a more human feel.
All of this is of course automatically recalled when you select a clip, so there's no need to press all kinds of Lemur settings to load stuff out of Ableton. I still need to update a few things before it's ready, and I also need to add a normal-style note editor. Later on, automation drawing will be implemented too, but that requires the rest to work perfectly first. That's all for now. Hopefully next week it's ready for some public testing.
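Editor's sketch of the grid math described above: a 4 x 16 module where each row is one bar of 16th-note steps, which is why drawing 4 bars is as fast as drawing 1. This is not the author's code; it just assumes 4/4 time (one bar = 4.0 beats, one step = 0.25 beats).

```python
# Hypothetical mapping for one 4x16 step module covering 4 bars.
# Row r (0-3) is bar r; column c (0-15) is the 16th-note step in that bar.
# Assumes 4/4 time: one bar = 4.0 beats, one step = 0.25 beats.

STEPS_PER_ROW = 16
BEATS_PER_BAR = 4.0
STEP_LEN = BEATS_PER_BAR / STEPS_PER_ROW  # 0.25 beats

def step_to_beat(row, col):
    """Return the clip time (in beats) of the step at (row, col)."""
    return row * BEATS_PER_BAR + col * STEP_LEN

def beat_to_step(beat):
    """Inverse mapping: a beat position back to its (row, col) cell."""
    row = int(beat // BEATS_PER_BAR)
    col = int(round((beat % BEATS_PER_BAR) / STEP_LEN))
    return row, col
```

Per-step offset (the "human feel" feature) would then just be a small signed value added to `step_to_beat` before writing the note.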
|
|
|
Post by xPsycHoWasPx on Mar 15, 2019 21:27:34 GMT
Gotten a bit further with the browser part:
|
|
|
Post by dollanaire on Mar 23, 2019 13:34:50 GMT
OMG!! Man, I need this!! This looks so awesome. I'd be honored if you could send it to me; I would love to test it for you. My email is dollanaire@gmail.com
|
|
|
Post by xPsycHoWasPx on Mar 23, 2019 20:09:50 GMT
dollanaire wrote: "OMG!! Man I need this!! This looks so awesome. I'd be honored if you could send this to me I would love to test it for you. My email is dollanaire@gmail.com"
In 1-2 weeks I should have something a bit more bulletproof ready for testing. It's iPad only, of course, since Android uses different resolutions than the iPad and the code in the Canvas objects isn't designed to be stretched yet; on Android it will show black sidebars.
Also, if you are a Windows user, I need to know that before I send it, because in its current state the Ableton MIDI Remote Script needs to access a config file in /Users/Shared/ that tells the script what IP Live has to send OSC signals to. That's for output from Live to Lemur only. The reason is that OSC can send browser info / MIDI notes at such greater speed that MIDI is a slow turtle by comparison xD. Also, you can send name strings directly, with no need for an ASCII-to-number converter.
Input to Live still relies on MIDI SysEx: because of the way an OSC server in Python would normally have to be executed, it can't be run inside a MIDI Remote Script. MIDI still has the advantage there since it updates in real time, while an OSC server would only be able to update every 100 ms, and only while Live is playing. So: OSC for output and MIDI for input, for now.
An external Python script for converting OSC to MIDI is in the works; it will kill any need for the Lemur Daemon, since it can create its own MIDI port xD. It will also serve other purposes, like receiving the current file path of clips and sending sample/clip waveforms to Lemur. There are also plans for adding a MongoDB or SQL database in the far future, which will give you the ability to organize all your samples and search/load them without really having to move any of the files physically on the hard drive. It's a database framework, so there's no limit to what kind of categories, or how many, you can put your samples into.
Since it can store arrays, it could also work as MIDI SysEx preset storage for hardware gear: presets can be transmitted to Lemur over OSC, and Lemur can then forward them through the MIDI ports connected to it, turning Live/Lemur into an awesome hardware preset manager too. Plus, since Live 10, MIDI Remote Scripts can now be used in ReWire mode, so this Lemur controller won't be limited to Live in standalone mode; you can easily load Live into Cubase / Pro Tools / Logic etc. in ReWire mode. That's kind of what gave me the reason to start building this in the first place: Lemur control of step sequences, with the power of Ableton sequencing, inside Cubase / Studio One... Do I need to say more? xD (3x64 Step Editor in action - no sound)
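Editor's sketch of the /Users/Shared/ config lookup described above. The post doesn't give the actual file name or format, so the path, key names, and defaults below are all assumptions; only the idea (a shared file telling the script which IP and port to send OSC output to) comes from the post.

```python
# Hypothetical config lookup for the OSC output target.
# Assumed file: /Users/Shared/lemur_osc.cfg with key=value lines, e.g.
#   osc_target_ip=192.168.1.23
#   osc_target_port=8000
import os

CONFIG_PATH = "/Users/Shared/lemur_osc.cfg"  # assumed name

def parse_osc_target(text, default_ip="127.0.0.1", default_port=8000):
    """Extract the OSC destination (ip, port) from the config text."""
    ip, port = default_ip, default_port
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("osc_target_ip="):
            ip = line.split("=", 1)[1].strip()
        elif line.startswith("osc_target_port="):
            port = int(line.split("=", 1)[1].strip())
    return ip, port

def load_osc_target(path=CONFIG_PATH):
    """Read the config file, falling back to defaults if it is missing."""
    if not os.path.exists(path):
        return "127.0.0.1", 8000
    with open(path) as f:
        return parse_osc_target(f.read())
```

On Windows the only change needed would be the path, which matches the "tiny tweak" mentioned later in the thread.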
|
|
|
Post by dollanaire on Mar 24, 2019 14:51:40 GMT
I can't wait! Yeah, I'm a Windows user, and on iPad as well. That step editor is amazing, btw. Being able to send sample/clip waveforms to Lemur is what I've been looking for as well. You are truly on to something, my friend!!
|
|
|
Post by xPsycHoWasPx on Mar 28, 2019 19:54:48 GMT
dollanaire wrote: "I can't wait! Yeah I'm a windows user and ipad as well."
Okay, I will make a version of the script that works on Windows; it should just be a tiny tweak.
dollanaire wrote: "That step editor is amazing btw."
Hopefully it will be in the end, but again, it won't replace a Push or the Touchable app; I would still recommend getting a Push or that app besides this one. This controller will mainly focus on step sequencing, not so much on parameter controls and advanced keyboard modes etc. If device parameter control on a touch screen is what you need, I'd advise looking into the Touchable app, since it covers a wide range of advanced parameter controls, even custom looks for them. The same goes for keyboard modes: if you need to record live notes in real time, in a Push-like fashion, I'd advise buying a Push.
dollanaire wrote: "Being able to send sample/clip waveform to lemur is what I've been looking for as well."
That will be somewhat limited in this Lemur controller. Displaying waveform info is not something easily done: a Canvas object can only draw 2048 lines, and at one line per sample a WAV file needs 44,100 lines per second. Displaying just 1 second at a 1:1 aspect would require a lot of Canvas objects put together, plus 8 expressions per Canvas. I made it work, and it works okay, but it's not a thrilling experience compared to how a properly coded app could display the waveform. So the idea is only to show a basic waveform display while chopping parts of a sample onto a drum pad, and that's mostly the reason why it will be added later on.
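To make the numbers above concrete: at 44,100 samples/sec and only 2048 lines per Canvas, the audio has to be reduced by roughly 22:1 per second before drawing. A common approach (an editor's sketch, not the author's actual code) is to keep one (min, max) peak pair per drawn line:

```python
# Sketch: reduce raw audio samples to 2048 (min, max) peak pairs so a
# single Canvas object can draw an approximate waveform. Each pair becomes
# one vertical line from min to max.

def peaks_for_canvas(samples, lines=2048):
    """Downsample a sequence of samples to at most `lines` (min, max) pairs."""
    if not samples:
        return []
    chunk = max(1, len(samples) // lines)  # samples represented per line
    out = []
    for i in range(0, len(samples), chunk):
        window = samples[i:i + chunk]
        out.append((min(window), max(window)))
    return out[:lines]
```

This keeps transient peaks visible, which plain decimation (taking every Nth sample) would miss; the trade-off is that it is only an overview, not a 1:1 view.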
I will not provide any support for the Ableton Live MIDI Remote Script part, meaning the script supplied will be in *.pyc format, which can't be edited. Once I release it, it will work the way it's made and nothing else; any mods will probably break it. So for security reasons I won't supply the uncompiled script that Ableton normally just compiles at launch. I know it sounds harsh, but in the end the Python script is the whole brain, and gaining the knowledge to do all this took quite a while; it's not something I can easily teach to others. And this is mostly for a test audience, not a tutorial.
But if you can learn something from the Lemur project, then study it all you want. Right now Lemur is still the easiest way to build up my MIDI ideas, which is why it's in Lemur. Once the controller is where I want it to be, and the feedback from people is good enough, I will start converting it all into a real app and sell it on the App Store / Google Play / Windows Store.
|
|
|
Post by xPsycHoWasPx on Mar 28, 2019 20:32:23 GMT
Great, great news!!! Ever since I learned how to manipulate sequencer steps in the Live editor, one thing has puzzled me sooooooooo much: how do you access a MIDI clip / audio clip that is in Arrangement view? Normally, when editing MIDI clips in the Session view clip slots, it's just like doing tracks[0].clipslots[0].clip.getnotes(args) if I want to get notes from the first track and first clip slot, plus 1-2 functions to get the currently selected track and scene if I want the script to automatically send the selected clip slot to Lemur. But Arrangement mode doesn't have an index like that, which is also the reason Push and others don't do Arrangement mode (maybe they do now, I don't know). But FINALLLLLLYYYYYYY,
thanks to this silly little command:
view.detail_clip, it now accesses the clip based on whatever is shown in the Detail View. It just killed the need for not 1, not 2, but 3 mini functions... and... drum roll...
Arrangement mode now in the house!!!!!
|
|
|
Post by xPsycHoWasPx on Apr 3, 2019 11:48:23 GMT
Getting sooooo close now to the first release tests.
Just rounding up the last pieces of the puzzle, and then I will go through everything to test for various bugs and mistakes. Hopefully in a few days it's ready for those who want to help test it.
|
|
|
Post by xPsycHoWasPx on Apr 4, 2019 2:50:24 GMT
Created a new Canvas module to draw automation curves based on single points.
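The post doesn't show how the module works internally, but drawing a curve from single points usually means interpolating between breakpoints for every drawn x position. A minimal sketch, assuming sorted (time, value) breakpoints and simple linear interpolation:

```python
# Hypothetical breakpoint interpolation for an automation curve.
# `points` is a sorted list of (time, value) pairs; `t` is the x position
# being drawn. Values outside the breakpoint range clamp to the ends.

def curve_value(points, t):
    """Linearly interpolate the curve value at time t."""
    if not points:
        return 0.0
    if t <= points[0][0]:
        return points[0][1]
    if t >= points[-1][0]:
        return points[-1][1]
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
            return v0 + frac * (v1 - v0)
```

Evaluating this once per Canvas line gives a smooth curve from only a handful of stored points.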
|
|
|
Post by matatablack on Apr 5, 2019 0:34:12 GMT
Hi there! Awesome work, congratulations! A few years ago I wanted to do something similar, also to sequence a Drum Rack and load samples and devices dynamically. Then I needed to work, that never stopped, and it got more and more time-consuming. I would love to be a beta tester. My email is matiasm.rodriguez@hotmail.com. I'm also a developer, from Argentina. Greetings, and keep going!
|
|
|
Post by kandavu on May 14, 2019 4:41:02 GMT
Heya! That setup is beyond what I've seen Lemur do up till now. Kudos! I have several iPads and am on windoze. I'd be keen to test as well. 👍
|
|
|
Post by xPsycHoWasPx on May 25, 2019 12:30:31 GMT
kandavu wrote: "Heya! That setup is beyond what I've seen Lemur can do up till now. Cudos!"
True, but my Live Python script is the heart and soul of it all; Lemur is just the current frontend to access it. A Push controller is really just buttons and an LCD display: it won't do anything without the big library of scripts built for it, which is the true brain. The way Python scripting works with MIDI/OSC lets you build totally unique controls for almost everything, so you're not just limited to the standard types of MIDI messages and OSC; you can shape them however you need.
But Lemur's future is weak, so the final product will be a real app. At the moment I'm hitting a lot of limitations, with Canvas etc., so in the end it won't be sustainable enough. But Lemur right now is still the fastest/best platform to build up MIDI controller ideas on, so for now all testing will target Lemur users.
Right now I'm trying to finish up the last few things in the sequencer part and get that ready for a test release soon; the browser will follow shortly after. I just had a lot of real-life stuff delaying me over the last month or so, which is why there's still no release. So instead of stressing over something I know will still contain bugs, I will release it module by module. I guess it's also easier to get feedback on single modules than on a whole app that is only the beginning anyway. But instead of mailing, I created a Discord group where all support and releases will be: discord.gg/MbMB32
|
|
|
Post by kandavu on May 25, 2019 14:33:24 GMT
That’s great! Registered.
|
|