
Port subscriptions related changes from refactor#13347

Merged
theimpulson merged 2 commits into TeamNewPipe:dev from theimpulson:subscriptions on Mar 16, 2026

Conversation

@theimpulson
Member

What is it?

  • Bugfix (user facing)
  • Feature (user facing) ⚠️ Your PR must target the refactor branch
  • Codebase improvement (dev facing)
  • Meta improvement to the project (dev facing)

Description of the changes in your PR

APK testing

The APK can be found by going to the "Checks" tab below the title. On the left pane, click on "CI", scroll down to "artifacts" and click "app" to download the zip file which contains the debug APK of this PR. You can find more info and a video demonstration on this wiki page.

Due diligence

theimpulson and others added 2 commits March 15, 2026 20:51
Please see TeamNewPipe#11759 for the original change

Signed-off-by: Aayush Gupta <aayushgupta219@gmail.com>
…iptions

* create SubscriptionsImportExportHelper to share common code used in
  SubscriptionFragment and BackupRestoreSettingsFragment
* Add UI options for import/export in BackupRestoreSettingsFragment
@theimpulson theimpulson requested review from Stypox and TobiGr March 15, 2026 12:58
@github-actions github-actions Bot added the size/giant PRs with more than 750 changed lines label Mar 15, 2026
@theimpulson
Member Author

I don't think the functionality actually needs a dependency on the rx3 library. I will open a separate PR to address that, and then we can merge it into refactor as well.

@TobiGr TobiGr added import/export Anything related to import/export of data and subscriptions feed Issue is related to the feed labels Mar 15, 2026
@theimpulson theimpulson merged commit 668af4f into TeamNewPipe:dev Mar 16, 2026
6 checks passed
@theimpulson theimpulson deleted the subscriptions branch March 16, 2026 01:23
@Stypox
Member

Stypox commented Apr 26, 2026

These changes broke importing subscriptions on Android, also pinging @Isira-Seneviratne as the original author of the changes in #11759.

The previous code used new StoredFileHelper(..., url, ...).getType() to figure out the mime type of the input (which internally uses DocumentFile.fromSingleUri(context, uri)). StoredFileHelper is also used to then open the input stream, to maintain compatibility with pre-SAF (Storage Access Framework) devices.

The current code uses MimeTypeMap.getFileExtensionFromUrl(input.url) to find the mime type; however, that always fails, because the document URL provided by SAF is just an ID and does not contain the filename or extension (e.g. content://com.android.providers.media.documents/document/document%3A1000051171). Furthermore, the input stream is then opened with applicationContext.contentResolver.openInputStream(), which I guess only works with SAF URLs and not with NNFP (No-Nonsense-File-Picker) URLs, considering that StoredFileHelper.getStream() has a nontrivial implementation.
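To illustrate the failure mode, here is a plain-Kotlin sketch of extension-from-URL lookup. It is a simplified stand-in for android.webkit.MimeTypeMap.getFileExtensionFromUrl, not the real implementation: on a SAF document URL the last path segment is an opaque ID with no filename, so no extension can ever be recovered and the code falls through to its default mime type.

```kotlin
// Simplified stand-in for MimeTypeMap.getFileExtensionFromUrl (assumption:
// not the real Android implementation, just the same observable behavior
// for these cases): take the last path segment and look for an extension.
fun fileExtensionFromUrl(url: String): String {
    val withoutQuery = url.substringBefore('?').substringBefore('#')
    val lastSegment = withoutQuery.substringAfterLast('/')
    // A SAF document URL ends in an opaque ID with no '.', so this is empty.
    return if ('.' in lastSegment) lastSegment.substringAfterLast('.') else ""
}
```

Running this on a SAF document URL returns "" (hence the fallback to DEFAULT_MIME and the "application/octet-stream" error below), while a plain file URL such as file:///sdcard/subscriptions.csv returns "csv".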

I would propose restoring the previous file-handling code both on dev and on refactor. The rest of the code-structure improvements can definitely be kept (i.e. using a Worker, etc.), but I don't see a reason to ditch StoredFileHelper and reinvent it anyway.

To test this, just create a .csv file with random data, and you will see that the code fails before even trying to read that data with the following exception:

Error while loading subscriptions from path
org.schabi.newpipe.extractor.subscription.SubscriptionExtractor$InvalidSourceException: Not a valid source (Unsupported content type: application/octet-stream)
at org.schabi.newpipe.extractor.services.youtube.extractors.YoutubeSubscriptionExtractor.fromInputStream(YoutubeSubscriptionExtractor.java:67)
at org.schabi.newpipe.local.subscription.workers.SubscriptionImportWorker$loadSubscriptionsFromInput$2.invokeSuspend(SubscriptionImportWorker.kt:127)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:34)

@Stypox
Member

Stypox commented Apr 26, 2026

Other issues I've found:

  1. The code that loads the subscriptions is broken: if any single subscription fails to load (e.g. a channel request fails), the whole import fails. This is especially problematic because channelInfo.tabs[0] may not exist and throws an index-out-of-bounds error.

  2. When importing many subscriptions, the app crashes after a minute or so, and this appears:

     Unable to start foreground service
     android.app.ForegroundServiceStartNotAllowedException: Time limit already exhausted for foreground service type dataSync

  3. When importing many subscriptions, the system goes out of memory, because all channels & streams are first loaded into memory and only saved to the database after everything has finished loading:

     Work [ id=10d06095-cf5c-453d-9963-68e499696ede, tags={ org.schabi.newpipe.local.subscription.workers.SubscriptionImportWorker } ] failed because it threw an exception/error
     java.lang.OutOfMemoryError: Failed to allocate a 32 byte allocation with 1237104 free bytes and 1208KB until OOM, target footprint 201326592, growth limit 201326592; giving up on allocation because <1% of heap free after GC.

  4. Even after changing the code to save to the db right away, it still goes out of memory. It only stopped going out of memory when I removed the parallelism (i.e. .limitedParallelism(PARALLEL_EXTRACTIONS)).
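The buffering issue in point 3 can be sketched in plain Kotlin (a hypothetical helper, not code from this PR): instead of collecting every channel into one list before saving, consume the input as a lazy Sequence and persist each chunk as soon as it is complete, so at most one chunk is held in memory at a time.

```kotlin
// Hypothetical sketch of save-as-you-go importing: `save` would be something
// like subscriptionManager.upsertAll(chunk). Earlier chunks are not retained,
// so peak memory is bounded by chunkSize instead of by the whole import.
fun <T> importInChunks(items: Sequence<T>, chunkSize: Int, save: (List<T>) -> Unit): Int {
    var saved = 0
    items.chunked(chunkSize).forEach { chunk ->
        save(chunk)          // persist immediately, then drop the chunk
        saved += chunk.size
    }
    return saved
}
```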

So maybe it's better to revert this PR altogether, and to keep things much simpler in general. I don't see a need to fetch streams together with the channel info: the user will just reload the feed if they want to see videos. And storing stuff to the database in chunks is not necessary at all imo (the db is orders of magnitude faster than the internet anyway). Or we could skip loading channel info altogether and just save the channels to the db with the little information provided in the Google takeout (i.e. ID,URL,NAME).
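The last idea above can be sketched as a pure-Kotlin parser for the three takeout columns, with no network fetch at all. TakeoutSubscription and parseTakeoutCsv are hypothetical names for illustration, and the naive comma split is an assumption (a real parser should handle quoted fields):

```kotlin
// Hypothetical record holding only what Google takeout provides per row.
data class TakeoutSubscription(val id: String, val url: String, val name: String)

// Parse "Channel Id,Channel Url,Channel Title" rows into records that could
// be stored directly in the db. Naive split: channel titles rarely contain
// commas, but quoted CSV fields are not handled here.
fun parseTakeoutCsv(csv: String): List<TakeoutSubscription> =
    csv.lineSequence()
        .drop(1)                      // skip the header row
        .filter { it.isNotBlank() }
        .map { line ->
            val cols = line.split(',', limit = 3)
            TakeoutSubscription(cols[0].trim(), cols[1].trim(), cols.getOrElse(2) { "" }.trim())
        }
        .toList()
```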

I quickly hacked together something that allowed me to load 340 subscriptions for a friend:
diff --git a/app/src/main/java/org/schabi/newpipe/local/subscription/SubscriptionManager.kt b/app/src/main/java/org/schabi/newpipe/local/subscription/SubscriptionManager.kt
index 5cf378cc3..8d1650259 100644
--- a/app/src/main/java/org/schabi/newpipe/local/subscription/SubscriptionManager.kt
+++ b/app/src/main/java/org/schabi/newpipe/local/subscription/SubscriptionManager.kt
@@ -50,13 +50,13 @@ class SubscriptionManager(context: Context) {
         }
     }
 
-    fun upsertAll(infoList: List<Pair<ChannelInfo, ChannelTabInfo>>) {
+    fun upsertAll(infoList: List<Pair<ChannelInfo, ChannelTabInfo?>>) {
         val listEntities = infoList.map { SubscriptionEntity.from(it.first) }
         subscriptionTable.upsertAll(listEntities)
 
         database.runInTransaction {
             infoList.forEachIndexed { index, info ->
-                val streams = info.second.relatedItems.filterIsInstance<StreamInfoItem>()
+                val streams = info.second?.relatedItems?.filterIsInstance<StreamInfoItem>() ?: listOf()
                 feedDatabaseManager.upsertAll(listEntities[index].uid, streams)
             }
         }
diff --git a/app/src/main/java/org/schabi/newpipe/local/subscription/workers/SubscriptionImportWorker.kt b/app/src/main/java/org/schabi/newpipe/local/subscription/workers/SubscriptionImportWorker.kt
index cc8cf6f24..e33262429 100644
--- a/app/src/main/java/org/schabi/newpipe/local/subscription/workers/SubscriptionImportWorker.kt
+++ b/app/src/main/java/org/schabi/newpipe/local/subscription/workers/SubscriptionImportWorker.kt
@@ -2,6 +2,7 @@ package org.schabi.newpipe.local.subscription.workers
 
 import android.content.Context
 import android.content.pm.ServiceInfo
+import android.net.Uri
 import android.os.Build
 import android.os.Parcelable
 import android.util.Log
@@ -26,7 +27,10 @@ import kotlinx.parcelize.Parcelize
 import org.schabi.newpipe.BuildConfig
 import org.schabi.newpipe.R
 import org.schabi.newpipe.extractor.NewPipe
+import org.schabi.newpipe.extractor.channel.ChannelInfo
+import org.schabi.newpipe.extractor.channel.tabs.ChannelTabInfo
 import org.schabi.newpipe.local.subscription.SubscriptionManager
+import org.schabi.newpipe.streams.io.StoredFileHelper
 import org.schabi.newpipe.util.ExtractorHelper
 
 class SubscriptionImportWorker(
@@ -59,25 +63,39 @@ class SubscriptionImportWorker(
         val qty = subscriptions.size
         var title =
             applicationContext.resources.getQuantityString(R.plurals.load_subscriptions, qty, qty)
+        val subscriptionManager = SubscriptionManager(applicationContext)
 
         val channelInfoList =
             try {
-                withContext(Dispatchers.IO.limitedParallelism(PARALLEL_EXTRACTIONS)) {
-                    subscriptions
-                        .map {
-                            async {
-                                val channelInfo =
-                                    ExtractorHelper.getChannelInfo(it.serviceId, it.url, true).await()
+                withContext(Dispatchers.IO) {
+                    subscriptions.forEach {
+                        var currentName = ""
+                        val res = try {
+                            val channelInfo =
+                                ExtractorHelper.getChannelInfo(it.serviceId, it.url, true).await()
+                            currentName = channelInfo.name
+//                            if (channelInfo.tabs.isEmpty()) null else (channelInfo to null)
+                            try {
                                 val channelTab =
                                     ExtractorHelper.getChannelTab(it.serviceId, channelInfo.tabs[0], true).await()
 
-                                val currentIndex = mutex.withLock { index++ }
-                                setForeground(createForegroundInfo(title, channelInfo.name, currentIndex, qty))
-
                                 channelInfo to channelTab
+                            } catch (e: Exception) {
+                                Log.e(TAG, "Error while loading subscription data", e)
+                                channelInfo to null
                             }
-                        }.awaitAll()
+                        } catch (e: Exception) {
+                            Log.e(TAG, "Error while loading subscription data", e)
+                            null
+                        }
+
+                        val currentIndex = mutex.withLock { index++ }
+                        setForeground(createForegroundInfo(title, currentName, currentIndex, qty))
+
+                        res?.let { subscriptionManager.upsertAll(listOf(res)) }
+                    }
                 }
+                listOf<Pair<ChannelInfo, ChannelTabInfo?>>()
             } catch (e: Exception) {
                 if (BuildConfig.DEBUG) {
                     Log.e(TAG, "Error while loading subscription data", e)
@@ -93,7 +111,6 @@ class SubscriptionImportWorker(
         setForeground(createForegroundInfo(title, null, 0, 0))
         index = 0
 
-        val subscriptionManager = SubscriptionManager(applicationContext)
         for (chunk in channelInfoList.chunked(BUFFER_COUNT_BEFORE_INSERT)) {
             withContext(Dispatchers.IO) {
                 subscriptionManager.upsertAll(chunk)
@@ -123,7 +140,7 @@ class SubscriptionImportWorker(
                         val contentType =
                             MimeTypeMap.getFileExtensionFromUrl(input.url).ifEmpty { DEFAULT_MIME }
                         NewPipe.getService(input.serviceId).subscriptionExtractor
-                            .fromInputStream(it, contentType)
+                            .fromInputStream(it, "text/csv")
                             .map { SubscriptionItem(it.serviceId, it.url, it.name) }
                     }
 
